
50+ Remote SQL Jobs in India

Apply to 50+ Remote SQL Jobs on CutShort.io. Find your next job, effortlessly. Browse SQL Jobs and apply today!

Springer Capital
Remote only
0 - 1 yrs
₹5000 - ₹7000 / mo
PowerBI
Microsoft Excel
SQL
Attention to detail
Troubleshooting
+13 more

Springer Capital is a cross-border asset management firm specializing in real estate investment banking between China and the USA. We are offering a remote internship for aspiring data engineers interested in data pipeline development, data integration, and business intelligence.

The internship offers flexible start and end dates. A short quiz or technical task may be required as part of the selection process.

Responsibilities:

  • Design, build, and maintain scalable data pipelines for structured and unstructured data sources
  • Develop ETL processes to collect, clean, and transform data from internal and external systems
  • Support integration of data into dashboards, analytics tools, and reporting systems
  • Collaborate with data analysts and software developers to improve data accessibility and performance
  • Document workflows and maintain data infrastructure best practices
  • Assist in identifying opportunities to automate repetitive data tasks
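The pipeline responsibilities above can be sketched in miniature. This is a hedged illustration only; the CSV columns (property_id, city, price_usd) and the cleaning rules are hypothetical, not taken from the posting:

```python
import csv
import io

def run_pipeline(raw_csv: str) -> list[dict]:
    """Extract rows from a CSV source, clean them, and emit load-ready records."""
    rows = csv.DictReader(io.StringIO(raw_csv))
    records = []
    for row in rows:
        # Transform: drop incomplete rows, normalize casing and types.
        if not row["property_id"] or not row["price_usd"]:
            continue
        records.append({
            "property_id": row["property_id"].strip(),
            "city": row["city"].strip().title(),
            "price_usd": float(row["price_usd"]),
        })
    return records

raw = """property_id,city,price_usd
P-100,shanghai,250000
P-101,  chicago ,310000
P-102,,
"""
print(run_pipeline(raw))
```

A real pipeline would read from a database or API rather than an inline string, but the extract/clean/load shape is the same.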


VDart
Abirami Ramdoss
Posted by Abirami Ramdoss
Remote only
5 - 30 yrs
₹1L - ₹28L / yr
Snowflake
Snowflake schema
SQL
SQL Azure
Data Warehouse (DWH)

Snowflake Developer

Full-time

Remote

PAN India


 

·   Strong familiarity with data warehouses (Snowflake, SQL)

·   Proficiency in writing performant SQL queries/scripts to generate business insights and drive better organizational decision-making

·   Experience with cloud technologies (e.g., Snowflake, Azure)

·   Experience with Agile-based development

·   Problem-solving skills

·   Self-starter; willing to ask questions
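A small illustration of the "performant SQL" requirement, using an in-memory SQLite database as a stand-in for Snowflake; the orders table and its columns are invented for the example:

```python
import sqlite3

# In-memory stand-in; real work would target Snowflake via its connector.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE orders (order_id INTEGER, region TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, 'APAC', 120.0), (2, 'APAC', 80.0),
        (3, 'EMEA', 200.0), (4, 'EMEA', 50.0), (5, 'AMER', 300.0);
""")

# One set-based aggregate pass, rather than a query per region,
# is the usual route to performant insight queries.
insight = con.execute("""
    SELECT region, COUNT(*) AS n_orders, ROUND(SUM(amount), 2) AS revenue
    FROM orders
    GROUP BY region
    ORDER BY revenue DESC
""").fetchall()
print(insight)
```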

SDS softwares

Tanavee Sharma
Posted by Tanavee Sharma
Remote only
0 - 1.5 yrs
₹1L - ₹2.2L / yr
HTML/CSS
Javascript
React.js
NodeJS (Node.js)
tailwindcss
+12 more

💼 Job Title: Full Stack Developer (Fresher/Experienced)

🏢 Company: SDS Softwares

💻 Location: Work from Home

💸 Salary range: ₹7,000 - ₹18,000 per month (based on knowledge and interview)

🕛 Shift Timings: 12 PM to 9 PM


About the role: As a Full Stack Developer, you will work on both the front-end and back-end of web applications. You will be responsible for developing user-friendly interfaces and maintaining the overall functionality of our projects.


⚜️ Key Responsibilities:

- Collaborate with cross-functional teams to define, design, and ship new features.

- Develop and maintain high-quality web applications (front end + back end)

- Troubleshoot and debug applications to ensure peak performance.

- Participate in code reviews and contribute to the team’s knowledge base.


⚜️ Required Skills:

- Proficiency in HTML, CSS, JavaScript, React.js for front-end development. ✅

- Understanding of server-side languages such as Node.js, Python, or PHP. ✅

- Familiarity with database technologies such as MySQL, MongoDB, or PostgreSQL. ✅

- Basic knowledge of version control systems, particularly Git.

- Strong problem-solving skills and attention to detail.

- Excellent communication skills and a team-oriented mindset.


💠 Qualifications:

- Recent graduates or individuals with internship experience (6 months to 1.5 years) in software development.

- Must have a personal laptop and stable internet connection.

- Ability to join immediately is preferred.


If you are passionate about coding and eager to learn, we would love to hear from you. 👍


Ekloud INC
Remote only
6 - 14 yrs
₹18L - ₹22L / yr
Salesforce
Test Automation (QA)
Automation
Manual testing
SOQL
+9 more

Job description:

 

6+ years of hands-on experience with both manual and automated testing, with a strong preference for experience using AccelQ on Salesforce and SAP platforms.

 

Proven expertise in Salesforce, particularly within the Sales Cloud module.

 

Proficient in writing complex SOQL and SQL queries for data validation and backend testing.
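SOQL runs against Salesforce itself, so a live query isn't reproducible here; the sketch below shows the backend data-validation logic such queries typically feed, with hypothetical Opportunity records and a made-up rule:

```python
# Hypothetical records, standing in for SOQL results
# (e.g., SELECT Id, StageName, Amount FROM Opportunity).
opportunities = [
    {"Id": "006A", "StageName": "Closed Won", "Amount": 5000.0},
    {"Id": "006B", "StageName": "Closed Won", "Amount": None},
    {"Id": "006C", "StageName": "Prospecting", "Amount": 1200.0},
]

def validate(records):
    """Return Ids violating a sample rule: closed-won deals must carry an Amount."""
    return [r["Id"] for r in records
            if r["StageName"] == "Closed Won" and r["Amount"] is None]

# Any Id returned flags a data-quality defect for the QA report.
print(validate(opportunities))
```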

 

Extensive experience in designing and developing robust, reusable automated test scripts for Salesforce environments.

 

Highly skilled at early issue detection, with a deep understanding of backend configurations, process flows and validation rules.

 

Should have a strong background in Salesforce testing, with hands-on experience in automation tools such as Selenium, Provar, or TestNG. 


You will be responsible for creating and maintaining automated test scripts, executing test cases, identifying bugs, and ensuring the quality and reliability of Salesforce applications. 

 

A solid understanding of Salesforce modules (Sales Cloud, Service Cloud, etc.) and APIs is essential. 

 

Experience with CI/CD tools like Jenkins and version control systems like Git is preferred. 

 

You will work closely with developers, business analysts, and stakeholders to define test strategies and improve the overall QA process. 

CARE INFOTECH
Zeeshan Sheikh
Posted by Zeeshan Sheikh
Remote only
8 - 10 yrs
₹6L - ₹8L / yr
Architecture
API
J2EE
Integration
SQL
+10 more

Requirements:

• Should have total experience of 8+ years.

• Good understanding of Overall Calypso Architecture

• Hands-on experience on V17 of Calypso is mandatory

• Hands-on Calypso development and customizations, like Scheduled Tasks, Reports, Engines, Messages, Accounting

• Understanding and application of Calypso APIs in detail

• Understanding and hands-on experience of J2EE (Spring) is also mandatory

• Good analytical skills

• Good understanding of Calypso message and sender frameworks and accounting framework

• Ability to write SQL queries and understanding of Calypso database tables

• Good and effective communication skills

• Experience on working in Agile methodology

• Should have experience working with Jira, VersionOne, or Rally

• Should have leadership skills

• Should have hands-on experience of Leading Calypso Development teams for more than 3-4 years

• Should have experience of executing development/customization projects from analysis, designing, estimation, coding, unit testing, supporting UAT and BUAT, to go-live and production support

• Lead the daily/weekly meetings with client

• Should have experience preparing Status/Progress reports

• Should have experience of working in a DevOps-enabled environment including Jenkins, code quality & control tools, DVCS, Git

CARE INFOTECH
Zeeshan Sheikh
Posted by Zeeshan Sheikh
Remote only
10 - 12 yrs
₹7L - ₹9L / yr
Java
Microservices
API
SQL Azure
Windows Azure
+7 more

Requirements:

• Strong expertise in Java (Java 8 and Java 17 or higher).

• Proficiency in frameworks like Spring Boot, Microservice Architecture.

• Experience in cloud-native development and deployment on Microsoft Azure.

• Hands-on experience with Azure services such as Azure App Services, Functions, Kubernetes (AKS), Azure DevOps, Blob Storage, and Service Bus.

• Knowledge of RESTful APIs, SOAP, and microservices architecture.

• Solid understanding of database technologies (e.g., Azure SQL)

CARE INFOTECH
Zeeshan Sheikh
Posted by Zeeshan Sheikh
Remote only
4 - 6 yrs
₹5L - ₹7L / yr
Team leadership
Test cases
Status reports
Calypso
Debugging
+10 more

Mandatory skills

• Experience of leading a team

• Hands-on experience of status/progress reporting with apt Test Metrics

• Coordination with BAs/SMEs/Developers, Management

• Experience in Capital Markets

• Knowledge of Derivatives is a must

Calypso Product Knowledge

• Experience on V17 (Good to have)

• Problem/issue investigation and resolution

• Good communication skills, both written and verbal

• Experience of working in Agile

CARE INFOTECH
Zeeshan Sheikh
Posted by Zeeshan Sheikh
Remote only
4 - 6 yrs
₹6L - ₹8L / yr
Calypso
Workflow
BO
Java
SQL
+2 more

Mandatory skills

• Calypso, Java, SQL, Unix, Eclipse

• Awareness of Calypso architecture (including engines, servers, build and deployment), preferably V17 and above

• Good understanding of the capital markets domain and awareness of FO/MO/BO functions

• Awareness of the pre-requisites and trade lifecycle for either Rates/Commodity/FX/Credit/Equity asset class

• Experience on Static and Reference data setup – LE, Books, SDI, Trade Filter, Static Data Filter, etc.

• Experience on workflow and workflow customizations for Trade/Messages/Transfers

• Experience on BO customizations – Messages, Reports, Scheduled Task, Task Station

• Experience on implementation for an interfacing application (custom/out-of-the-box) such as MarkitWire, Reuters, DTCC, Acadia, etc.

• Analysis, communication and estimation skills

• Good documentation and unit testing skills

• Ability to train resources

Springer Capital
Andrew Rose
Posted by Andrew Rose
Remote only
0 - 1 yrs
₹5000 - ₹7000 / mo
Attention to detail
Troubleshooting
Data modeling
Warehousing concepts
Google Cloud Platform (GCP)
+15 more

Springer Capital is a cross-border asset management firm specializing in real estate investment banking between China and the USA. We are offering a remote internship for aspiring data engineers interested in data pipeline development, data integration, and business intelligence. The internship offers flexible start and end dates. A short quiz or technical task may be required as part of the selection process. 

 

Responsibilities: 

▪ Design, build, and maintain scalable data pipelines for structured and unstructured data sources 

▪ Develop ETL processes to collect, clean, and transform data from internal and external systems 

▪ Support integration of data into dashboards, analytics tools, and reporting systems 

▪ Collaborate with data analysts and software developers to improve data accessibility and performance 

▪ Document workflows and maintain data infrastructure best practices 

▪ Assist in identifying opportunities to automate repetitive data tasks 

venanalytics

Rincy jain
Posted by Rincy jain
Remote only
2 - 4 yrs
₹5L - ₹12L / yr
Django
Python
SQL
React.js
HTML/CSS

Role Objective

 

Develop business relevant, high quality, scalable web applications. You will be part of a dynamic AdTech team solving big problems in the Media and Entertainment Sector.

 

Roles & Responsibilities

 

* Application Design: Understand requirements from the user, create stories, and be part of the design team. Review designs, give regular feedback, and ensure that the designs meet user expectations.

 

* Architecture: Create scalable and robust system architecture. The design should be in line with the client infra. This could be on-prem or cloud (Azure, AWS or GCP). 

 

* Development: You will be responsible for development of the front end and back end. The application stack will comprise (depending on the project) SQL, Django, Angular/React, HTML, and CSS. Knowledge of GoLang and Big Data is a plus.

 

* Deployment: Suggest and implement a deployment strategy that is scalable and cost-effective. Create a detailed resource architecture and get it approved. CI/CD deployment on IIS or Linux. Knowledge of Docker is a plus.

 

* Maintenance: Maintaining development and production environments will be a key part of your job profile. This also includes troubleshooting, fixing bugs, and suggesting ways to improve the application.

 

* Data Migration: In the case of database migration, you will be expected to suggest appropriate strategies and implementation plans. 

 

* Documentation: Create a detailed document covering important aspects like HLD, Technical Diagram, Script Design, SOP etc. 

 

* Client Interaction: You will be interacting with the client on a day-to-day basis and hence having good communication skills is a must. 

 

Requirements

 

Education: B.Tech (Comp. Sc., IT) or equivalent

 

Experience: 3+ years of experience developing applications on Django, Angular/React, HTML, and CSS

 

Behavioural Skills-

  1. Clear and Assertive communication 

  2. Ability to comprehend the business requirement  

  3. Teamwork and collaboration 

  4. Analytical thinking

  5. Time Management 

  6. Strong troubleshooting and problem-solving skills 

 

Technical Skills-

  1. Back-end and Front-end Technologies: Django, Angular/React, HTML and CSS. 

  2. Cloud Technologies: AWS, GCP, and Azure 

  3. Big Data Technologies: Hadoop and Spark 

  4. Containerized Deployment: Docker and Kubernetes is a plus.

  5. Other: Understanding of Golang is a plus.

 

Deltek
shwetha V
Posted by shwetha V
Remote only
2 - 3 yrs
Best in industry
PHP
Python
Powershell
Bash
SQL
+1 more

Ops Analysts/Sys Admin

Company Summary :

As the recognized global standard for project-based businesses, Deltek delivers software and information solutions to help organizations achieve their purpose. Our market leadership stems from the work of our diverse employees who are united by a passion for learning, growing and making a difference. At Deltek, we take immense pride in creating a balanced, values-driven environment, where every employee feels included and empowered to do their best work. Our employees put our core values into action daily, creating a one-of-a-kind culture that has been recognized globally. Thanks to our incredible team, Deltek has been named one of America's Best Midsize Employers by Forbes, a Best Place to Work by Glassdoor, a Top Workplace by The Washington Post and a Best Place to Work in Asia by World HRD Congress. www.deltek.com

External Job Title :

Systems Engineer 1

Position Responsibilities :

We are seeking a highly skilled and motivated System Engineer to join our team. Beyond a strong technical background, excellent problem-solving abilities, and a collaborative mindset, the ideal candidate will be a self-starter with a high level of initiative and a passion for experimentation. This role requires someone who thrives in a fast-paced environment and is eager to take on new challenges.

 

  • Technical Skills:

Must Have Skills:

  • PHP
  • SQL; Relational Database Concepts
  • At least one scripting language (Python, PowerShell, Bash, UNIX, etc.)
  • Experience with Learning and Utilizing APIs

Nice to Have Skills:

  • Experience with AI Initiatives & exposure of GenAI and/or Agentic AI projects
  • Microsoft Power Apps
  • Microsoft Power BI 
  • Atomic
  • Snowflake
  • Cloud-Based Application Development 
  • Gainsight
  • Salesforce 

 

  • Soft Skills:

Must Have Skills:

  • Flexible Mindset for Solution Development 
  • Independent and Self-Driven; Autonomous
  • Investigative; drives toward resolving Root Cause of Stakeholder needs instead of treating Symptoms; Critical Thinker
  • Collaborative mindset to drive best results 

Nice to Have Skills:

  • Business Acumen (Very Nice to Have)


 

  • Responsibilities:
  • Develop and maintain system solutions to meet stakeholder needs.
  • Collaborate with team members and stakeholders to ensure effective communication and teamwork.
  • Independently drive projects and tasks to completion with minimal supervision.
  • Investigate and resolve root causes of issues, applying critical thinking to develop effective solutions.
  • Adapt to changing requirements and maintain a flexible approach to solution development.

Qualifications :

  • A college degree in Computer Science, Software Engineering, Information Science or a related field is required 
  • Minimum 2-3 years of programming experience with PHP, Power BI or Snowflake, Python, and API integration.
  • Proven experience in system engineering or a related field.
  • Strong technical skills in the required areas.
  • Excellent problem-solving and critical thinking abilities.
  • Ability to work independently and as part of a team.
  • Strong communication and collaboration skills.

IT services and consulting


Agency job
via Myhashtaggs by Ravikanth Dangeti
Remote only
4 - 8 yrs
₹10L - ₹12L / yr
Machine Learning (ML)
Deep Learning
Algorithms
Python
PowerBI
+5 more



Remote Job Opportunity

Job Title: Data Scientist

Contract Duration: 6 months+

Location: Offshore (India)


Work Time: 3 pm to 12 am


Must have 4+ years of relevant experience.


Job Summary:

We are seeking an AI Data Scientist with a strong foundation in machine learning, deep learning, and statistical modeling to design, develop, and deploy cutting-edge AI solutions.

The ideal candidate will have expertise in building and optimizing AI models, with a deep understanding of both statistical theory and modern AI techniques. You will work on high-impact projects, from prototyping to production, collaborating with engineers, researchers, and business stakeholders to solve complex problems using AI.


Key Responsibilities:

Research, design, and implement machine learning and deep learning models for predictive and generative AI applications.

Apply advanced statistical methods to improve model robustness and interpretability.

Optimize model performance through hyperparameter tuning, feature engineering, and ensemble techniques.

Perform large-scale data analysis to identify patterns, biases, and opportunities for AI-driven automation.

Work closely with ML engineers to validate, train, and deploy the models.

Stay updated with the latest research and developments in AI and machine learning to ensure innovative and cutting-edge solutions.
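The hyperparameter tuning mentioned in the responsibilities can be illustrated with a toy hold-out grid search. This is a schematic only: the model is a closed-form 1-D ridge regression, and all data and penalty values are invented:

```python
def fit_ridge_1d(xs, ys, lam):
    """Closed-form ridge fit for y ≈ w*x (no intercept): penalized least squares."""
    return sum(x * y for x, y in zip(xs, ys)) / (sum(x * x for x in xs) + lam)

def mse(w, xs, ys):
    """Mean squared error of predictions w*x against targets ys."""
    return sum((y - w * x) ** 2 for x, y in zip(xs, ys)) / len(xs)

# Toy data: y ≈ 2x with mild noise, plus a held-out validation split.
train_x, train_y = [1, 2, 3, 4], [2.1, 3.9, 6.2, 7.8]
val_x, val_y = [5, 6], [10.1, 11.9]

# Grid search: fit on the training split for each candidate penalty,
# score on the validation split, keep the best.
best_lam = min([0.0, 0.1, 1.0, 10.0],
               key=lambda lam: mse(fit_ridge_1d(train_x, train_y, lam), val_x, val_y))
print(best_lam)
```

In practice this loop is what tools like scikit-learn's grid search automate over real models and cross-validation folds.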


Qualifications & Skills:

Education: PhD or Master's degree in Statistics, Mathematics, Computer Science, or a related field.


Experience:

4+ years of experience in machine learning and deep learning, with expertise in algorithm development and optimization.

Proficiency in SQL, Python, and visualization tools (Power BI).

Experience in developing mathematical models for business applications, preferably in finance, trading, image-based AI, biomedical modeling, or recommender systems.

Strong communication skills to interact effectively with both technical and non-technical stakeholders.

Excellent problem-solving skills with the ability to work independently and as part of a team.

venanalytics

Rincy jain
Posted by Rincy jain
Remote only
2 - 3 yrs
₹7L - ₹12L / yr
SQL
PowerBI
Python
Big Data

About the Role:

We are looking for a highly skilled Data Engineer with a strong foundation in Power BI, SQL, Python, and Big Data ecosystems to help design, build, and optimize end-to-end data solutions. The ideal candidate is passionate about solving complex data problems, transforming raw data into actionable insights, and contributing to data-driven decision-making across the organization.


Key Responsibilities:

  • Data Modelling & Visualization
  • Build scalable and high-quality data models in Power BI using best practices.
  • Define relationships, hierarchies, and measures to support effective storytelling.
  • Ensure dashboards meet standards in accuracy, visualization principles, and timelines.
  • Data Transformation & ETL
  • Perform advanced data transformation using Power Query (M Language) beyond UI-based steps.
  • Design and optimize ETL pipelines using SQL, Python, and Big Data tools.
  • Manage and process large-scale datasets from various sources and formats.
  • Business Problem Translation
  • Collaborate with cross-functional teams to translate complex business problems into scalable, data-centric solutions.
  • Decompose business questions into testable hypotheses and identify relevant datasets for validation.
  • Performance & Troubleshooting
  • Continuously optimize performance of dashboards and pipelines for latency, reliability, and scalability.
  • Troubleshoot and resolve issues related to data access, quality, security, and latency, adhering to SLAs.
  • Analytical Storytelling
  • Apply analytical thinking to design insightful dashboards—prioritizing clarity and usability over aesthetics.
  • Develop data narratives that drive business impact.
  • Solution Design
  • Deliver wireframes, POCs, and final solutions aligned with business requirements and technical feasibility.
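The "advanced data transformation using Power Query (M Language)" above covers reshaping like unpivoting, which M handles natively; a rough Python equivalent of unpivoting a wide table, with hypothetical columns, looks like this:

```python
def unpivot(rows, id_cols):
    """Turn wide monthly columns into long (month, value) rows,
    mirroring Power Query's 'Unpivot Other Columns' step."""
    long_rows = []
    for row in rows:
        keys = {k: row[k] for k in id_cols}
        for col, val in row.items():
            if col not in id_cols:
                long_rows.append({**keys, "month": col, "sales": val})
    return long_rows

wide = [{"region": "North", "Jan": 100, "Feb": 120},
        {"region": "South", "Jan": 90, "Feb": 95}]
long_form = unpivot(wide, id_cols={"region"})
print(long_form)
```

Long format is generally what Power BI models and DAX measures expect, which is why this transformation recurs in ETL work.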

Required Skills & Experience:

  • Minimum 3+ years of experience as a Data Engineer or in a similar data-focused role.
  • Strong expertise in Power BI: data modeling, DAX, Power Query (M Language), and visualization best practices.
  • Hands-on with Python and SQL for data analysis, automation, and backend data transformation.
  • Deep understanding of data storytelling, visual best practices, and dashboard performance tuning.
  • Familiarity with DAX Studio and Tabular Editor.
  • Experience in handling high-volume data in production environments.

Preferred (Good to Have):

  • Exposure to Big Data technologies such as:
  • PySpark
  • Hadoop
  • Hive / HDFS
  • Spark Streaming (optional but preferred)

Why Join Us?

  • Work with a team that's passionate about data innovation.
  • Exposure to modern data stack and tools.
  • Flat structure and collaborative culture.
  • Opportunity to influence data strategy and architecture decisions.

 

Ekloud INC
Ankita G
Posted by Ankita G
Remote only
6 - 9 yrs
₹18L - ₹25L / yr
Salesforce
sales cloud
accelq
Test Automation (QA)
Automated testing
+2 more

Job Title: Salesforce QA Engineer

 

Experience: 6+ Years

 

Location: Bangalore - Hybrid (Manyata Tech Park)

 

Job description:

 

6+ years of hands-on experience with both manual and automated testing, with a strong preference for experience using AccelQ on Salesforce and SAP platforms.

 

Proven expertise in Salesforce, particularly within the Sales Cloud module.

 

Proficient in writing complex SOQL and SQL queries for data validation and backend testing.

 

Extensive experience in designing and developing robust, reusable automated test scripts for Salesforce environments.

 

Highly skilled at early issue detection, with a deep understanding of backend configurations, process flows and validation rules.

 

Should have a strong background in Salesforce testing, with hands-on experience in automation tools such as Selenium, Provar, or TestNG. 

You will be responsible for creating and maintaining automated test scripts, executing test cases, identifying bugs, and ensuring the quality and reliability of Salesforce applications. 

 

A solid understanding of Salesforce modules (Sales Cloud, Service Cloud, etc.) and APIs is essential. 

 

Experience with CI/CD tools like Jenkins and version control systems like Git is preferred. 

 

You will work closely with developers, business analysts, and stakeholders to define test strategies and improve the overall QA process. 

FloBiz
Agency job
via AccioJob by AccioJobHiring Board
Remote only
0 - 0 yrs
₹12L - ₹15L / yr
SQL
RESTful APIs
Object Oriented Programming (OOPs)
DSA

AccioJob is conducting a Walk-In Hiring Drive with FloBiz for the position of Backend Intern.


To apply, register and select your slot here: https://go.acciojob.com/dkfKBz


Required Skills: SQL, REST APIs, OOPs, DSA


Eligibility:

  • Degree: BTech./BE, BCA, BSc.
  • Branch: Computer Science/CSE/Other CS related branch, IT
  • Graduation Year: 2025, 2026


Work Details:

  • Work Location: Remote
  • CTC: ₹12 LPA to ₹15 LPA


Evaluation Process:

Round 1: Offline Assessment at AccioJob Skill Centres (Noida, Pune, Chennai, Hyderabad, Bangalore)


Further Rounds (for shortlisted candidates only):

Profile & Background Screening Round, Technical Interview Round 1, Technical Interview Round 2, Cultural Fit Round

Important Note: Bring your laptop & earphones for the test.


Register here: https://go.acciojob.com/dkfKBz

Or apply in seconds, straight from our brand-new app!

https://go.acciojob.com/L6rH7C


Fountane inc
HR Fountane
Posted by HR Fountane
Remote only
5 - 9 yrs
₹18L - ₹32L / yr
Amazon Web Services (AWS)
AWS Lambda
AWS CloudFormation
ETL
Docker
+3 more

Position Overview: We are looking for an experienced and highly skilled Senior Data Engineer to join our team and help design, implement, and optimize data systems that support high-end analytical solutions for our clients. As a customer-centric Data Engineer, you will work closely with clients to understand their business needs and translate them into robust, scalable, and efficient technical solutions. You will be responsible for end-to-end data modelling, integration workflows, and data transformation processes while ensuring security, privacy, and compliance. In this role, you will also leverage the latest advancements in artificial intelligence, machine learning, and large language models (LLMs) to deliver high-impact solutions that drive business success. The ideal candidate will have a deep understanding of data infrastructure, optimization techniques, and cost-effective data management.


Key Responsibilities:


• Customer Collaboration:

– Partner with clients to gather and understand their business requirements, translating them into actionable technical specifications.

– Act as the primary technical consultant to guide clients through data challenges and deliver tailored solutions that drive value.


• Data Modeling & Integration:

– Design and implement scalable, efficient, and optimized data models to support business operations and analytical needs.

– Develop and maintain data integration workflows to seamlessly extract, transform, and load (ETL) data from various sources into data repositories.

– Ensure smooth integration between multiple data sources and platforms, including cloud and on-premise systems


• Data Processing & Optimization:

– Develop, optimize, and manage data processing pipelines to enable real-time and batch data processing at scale.

– Continuously evaluate and improve data processing performance, optimizing for throughput while minimizing infrastructure costs.


• Data Governance & Security:

– Implement and enforce data governance policies and best practices, ensuring data security, privacy, and compliance with relevant industry regulations (e.g., GDPR, HIPAA).

– Collaborate with security teams to safeguard sensitive data and maintain privacy controls across data environments.


• Cross-Functional Collaboration:

– Work closely with data engineers, data scientists, and business analysts to ensure that the data architecture aligns with organizational objectives and delivers actionable insights.

– Foster collaboration across teams to streamline data workflows and optimize solution delivery.


• Leveraging Advanced Technologies:

– Utilize AI, machine learning models, and large language models (LLMs) to automate processes, accelerate delivery, and provide smart, data-driven solutions to business challenges.

– Identify opportunities to apply cutting-edge technologies to improve the efficiency, speed, and quality of data processing and analytics.


• Cost Optimization:

– Proactively manage infrastructure and cloud resources to optimize throughput while minimizing operational costs.

– Make data-driven recommendations to reduce infrastructure overhead and increase efficiency without sacrificing performance.


Qualifications:


• Experience:

– Proven experience (5+ years) as a Data Engineer or similar role, designing and implementing data solutions at scale.

– Strong expertise in data modelling, data integration (ETL), and data transformation processes.

– Experience with cloud platforms (AWS, Azure, Google Cloud) and big data technologies (e.g., Hadoop, Spark).


• Technical Skills:

– Advanced proficiency in SQL, data modelling tools (e.g., Erwin, PowerDesigner), and data integration frameworks (e.g., Apache NiFi, Talend).

– Strong understanding of data security protocols, privacy regulations, and compliance requirements.

– Experience with data storage solutions (e.g., data lakes, data warehouses, NoSQL, relational databases).


• AI & Machine Learning Exposure:

– Familiarity with leveraging AI and machine learning technologies (e.g., TensorFlow, PyTorch, scikit-learn) to optimize data processing and analytical tasks.

–Ability to apply advanced algorithms and automation techniques to improve business processes.


• Soft Skills:

– Excellent communication skills to collaborate with clients, stakeholders, and cross-functional teams.

– Strong problem-solving ability with a customer-centric approach to solution design.

– Ability to translate complex technical concepts into clear, understandable terms for non-technical audiences.


• Education:

– Bachelor’s or Master’s degree in Computer Science, Information Systems, Data Science, or a related field (or equivalent practical experience).


LIFE AT FOUNTANE:

  • Fountane offers an environment where all members are supported, challenged, recognized & given opportunities to grow to their fullest potential.
  • Competitive pay
  • Health insurance for spouses, kids, and parents.
  • PF/ESI or equivalent
  • Individual/team bonuses
  • Employee stock ownership plan
  • Fun/challenging variety of projects/industries
  • Flexible workplace policy - remote/physical
  • Flat organization - no micromanagement
  • Individual contribution - set your deadlines
  • Above all - culture that helps you grow exponentially!


A LITTLE BIT ABOUT THE COMPANY:

Established in 2017, Fountane Inc is a Ventures Lab incubating and investing in new competitive technology businesses from scratch. Thus far, we’ve created half a dozen multi-million valuation companies in the US and a handful of sister ventures for large corporations, including Target, US Ventures, and Imprint Engine.

We’re a team of 120+ strong from around the world who are radically open-minded, believe in excellence, respect one another, and push our boundaries further than ever before.

Read more
KGISL MICROCOLLEGE
Agency job
via EDU TECH by Srimathi Balamurugan
Remote, Kochi (Cochin)
1 - 5 yrs
₹2L - ₹6L / yr
Business Analysis
SQL
MS-Excel
Tableau
PowerBI

We are looking for a passionate and experienced Business Analyst Trainer to join our training team. This role involves delivering high-quality training programs on business analysis tools, methodologies, and best practices, both in-person and online.

Read more
Hypersonix Inc

Posted by Reshika Mendiratta
Remote only
7yrs+
Upto ₹30L / yr (Varies)
Data Analytics
SQL
MS-Excel
Python
R Programming
+5 more

About the Role:

We are looking for a Senior Technical Customer Success Manager to join our growing team. This is a client-facing role focused on ensuring successful adoption and value realization of our SaaS solutions. The ideal candidate will come from a strong analytics background, possess hands-on skills in SQL and Python or R, and have experience working with dashboarding tools. Prior experience in eCommerce or retail domains is a strong plus.


Responsibilities:

  • Own post-sale customer relationship and act as the primary technical point of contact.
  • Drive product adoption and usage through effective onboarding, training, and ongoing support.
  • Work closely with clients to understand business goals and align them with product capabilities.
  • Collaborate with internal product, engineering, and data teams to deliver solutions and enhancements tailored to client needs.
  • Analyze customer data and usage trends to proactively identify opportunities and risks.
  • Build dashboards or reports for customers using internal tools or integrations.
  • Lead business reviews, share insights, and communicate value delivered.
  • Support customers in configuring rules, data integrations, and troubleshooting issues.
  • Drive renewal and expansion by ensuring customer satisfaction and delivering measurable outcomes.
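To give a concrete flavor of the usage-trend analysis this role involves, here is a minimal Python sketch; the account names, data shape, and the 50% drop threshold are all hypothetical, and in practice the weekly counts would come from product telemetry or a warehouse query rather than an in-memory dict:

```python
from statistics import mean

def flag_at_risk_accounts(usage_by_account, drop_threshold=0.5):
    """Flag accounts whose recent usage fell below a fraction of their baseline.

    usage_by_account: dict mapping account name -> list of weekly event counts,
    oldest first (hypothetical shape, for illustration only).
    """
    at_risk = []
    for account, weekly_counts in usage_by_account.items():
        if len(weekly_counts) < 4:
            continue  # not enough history to judge a trend
        baseline = mean(weekly_counts[:-2])  # all but the last two weeks
        recent = mean(weekly_counts[-2:])    # the last two weeks
        if baseline > 0 and recent < drop_threshold * baseline:
            at_risk.append(account)
    return at_risk

usage = {
    "acme": [100, 120, 110, 30, 20],   # sharp recent drop -> at risk
    "globex": [80, 85, 90, 95, 100],   # healthy growth
}
print(flag_at_risk_accounts(usage))    # ['acme']
```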


Requirements:

  • 7+ years of experience in a Customer Success, Technical Account Management, or Solution Consulting role in a SaaS or software product company.
  • Strong SQL skills and working experience with Python or R.
  • Experience with dashboarding tools such as Tableau, Power BI, Looker, or similar.
  • Understanding of data pipelines, APIs, and data modeling.
  • Excellent communication and stakeholder management skills.
  • Proven track record of managing mid to large enterprise clients.
  • Experience in eCommerce, retail, or consumer-facing businesses is highly desirable.
  • Ability to translate technical details into business context and vice versa.
  • Bachelor’s or Master’s degree in Computer Science, Analytics, Engineering, or related field.


Nice to Have:

  • Exposure to machine learning workflows, recommendation systems, or pricing analytics.
  • Familiarity with cloud platforms (AWS/GCP/Azure).
  • Experience working with cross-functional teams in Agile environments.
Read more
NeoGenCode Technologies Pvt Ltd
Posted by Akshay Patil
Remote only
10 - 15 yrs
₹10L - ₹18L / yr
Solution architecture
Denodo
Data Virtualization
Data architecture
SQL
+5 more

Job Title : Solution Architect – Denodo

Experience : 10+ Years

Location : Remote / Work from Home

Notice Period : Immediate joiners preferred


Job Overview :

We are looking for an experienced Solution Architect – Denodo to lead the design and implementation of data virtualization solutions. In this role, you will work closely with cross-functional teams to ensure our data architecture aligns with strategic business goals. The ideal candidate will bring deep expertise in Denodo, strong technical leadership, and a passion for driving data-driven decisions.


Mandatory Skills : Denodo, Data Virtualization, Data Architecture, SQL, Data Modeling, ETL, Data Integration, Performance Optimization, Communication Skills.


Key Responsibilities :

  • Architect and design scalable data virtualization solutions using Denodo.
  • Collaborate with business analysts and engineering teams to understand requirements and define technical specifications.
  • Ensure adherence to best practices in data governance, performance, and security.
  • Integrate Denodo with diverse data sources and optimize system performance.
  • Mentor and train team members on Denodo platform capabilities.
  • Lead tool evaluations and recommend suitable data integration technologies.
  • Stay updated with emerging trends in data virtualization and integration.

Required Qualifications :

  • Bachelor’s degree in Computer Science, IT, or a related field.
  • 10+ Years of experience in data architecture and integration.
  • Proven expertise in Denodo and data virtualization frameworks.
  • Strong proficiency in SQL and data modeling.
  • Hands-on experience with ETL processes and data integration tools.
  • Excellent communication, presentation, and stakeholder management skills.
  • Ability to lead technical discussions and influence architectural decisions.
  • Denodo or data architecture certifications are a strong plus.
Read more
Remote only
4 - 6 yrs
₹10L - ₹15L / yr
Angular (2+)
.NET
SQL
Relational Database (RDBMS)
Dependency injection

.NET + Angular Full Stack Developer (4–5 Years Experience)

Location: Pune/Remote

Experience Required: 4 to 5 years

Communication: Fluent English (verbal & written)

Technology: .NET, Angular

Only immediate joiners who can start on 21st July should apply.


Job Overview

We are seeking a skilled and experienced Full Stack Developer with strong expertise in .NET (C#) and Angular to join our dynamic team in Pune. The ideal candidate will have hands-on experience across the full development stack, a strong understanding of relational databases and SQL, and the ability to work independently with clients. Experience in microservices architecture is a plus.


Key Responsibilities

  • Design, develop, and maintain modern web applications using .NET Core / .NET Framework and Angular
  • Write clean, scalable, and maintainable code for both backend and frontend components
  • Interact directly with clients for requirement gathering, demos, sprint planning, and issue resolution
  • Work closely with designers, QA, and other developers to ensure high-quality product delivery
  • Perform regular code reviews, ensure adherence to coding standards, and mentor junior developers if needed
  • Troubleshoot and debug application issues and provide timely solutions
  • Participate in discussions on architecture, design patterns, and technical best practices

Must-Have Skills

✅ Strong hands-on experience with .NET Core / .NET Framework (Web API, MVC)

✅ Proficiency in Angular (Component-based architecture, RxJS, State Management)

✅ Solid understanding of RDBMS and SQL (preferably with SQL Server)

✅ Familiarity with Entity Framework or Dapper

✅ Strong knowledge of RESTful API design and integration

✅ Version control using Git

✅ Excellent verbal and written communication skills

✅ Ability to work in a client-facing role and handle discussions independently

Good-to-Have / Optional Skills

Understanding or experience in Microservices Architecture

Exposure to CI/CD pipelines, unit testing frameworks, and cloud environments (e.g., Azure or AWS)

Read more
Wissen Technology

Posted by Shrutika SaileshKumar
Remote, Bengaluru (Bangalore)
5 - 9 yrs
Best in industry
Python
SDET
BDD
SQL
Data Warehouse (DWH)
+2 more

Primary skill set: QA Automation, Python, BDD, SQL 

As Senior Data Quality Engineer you will:

  • Evaluate product functionality and create test strategies and test cases to assess product quality.
  • Work closely with the on-shore and the offshore team.
  • Validate multiple reports against databases by running medium-to-complex SQL queries.
  • Develop a strong understanding of automation objects and integrations across various platforms and applications.
  • Work as an individual contributor, exploring opportunities to improve performance and articulating the importance and advantages of proposed improvements to management.
  • Integrate with SCM infrastructure to establish a continuous build-and-test cycle using CI/CD tools.
  • Work comfortably in Linux/Windows environments and hybrid infrastructure models hosted on cloud platforms.
  • Establish processes and tooling to maintain automation scripts and generate regular test reports.
  • Conduct peer reviews to provide feedback and ensure the test scripts are flawless.
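Report-versus-database validation of the kind described above can be sketched with Python's built-in sqlite3 module; the orders table, its columns, and the totals below are invented for illustration:

```python
import sqlite3

def validate_report_total(conn, report_total, region):
    """Compare a report's claimed total against an aggregate computed in the database."""
    row = conn.execute(
        "SELECT COALESCE(SUM(amount), 0) FROM orders WHERE region = ?", (region,)
    ).fetchone()
    db_total = row[0]
    return db_total == report_total, db_total

# Set up a throwaway in-memory database standing in for the real warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount INTEGER)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("south", 100), ("south", 250), ("north", 75)])

ok, db_total = validate_report_total(conn, report_total=350, region="south")
print(ok, db_total)  # True 350
```

A real test suite would wrap checks like this in a framework (e.g., BDD scenarios) and run them against every report in scope.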

Core/Must have skills:

  • Excellent understanding of and hands-on experience in ETL/DWH testing, preferably Databricks, paired with Python experience.
  • Hands-on experience with SQL (analytical functions and complex queries), along with knowledge of using SQL client utilities effectively.
  • Clear and crisp communication and commitment to deliverables.
  • Experience in Big Data testing will be an added advantage.
  • Knowledge of Spark and Scala, Hive/Impala, and Python will be an added advantage.

Good to have skills:

  • Test automation using BDD (Cucumber) or TestNG, combined with strong hands-on Java and Selenium experience; working experience with WebdriverIO is especially valued.
  • Ability to effectively articulate technical challenges and solutions
  • Work experience with qTest, Jira, and WebdriverIO


Read more
Hypersonix Inc

Posted by Reshika Mendiratta
Remote only
7yrs+
Upto ₹40L / yr (Varies)
SQL
Python
ETL
Data engineering
Big Data
+2 more

About the Company

Hypersonix.ai is disrupting the e-commerce space with AI, ML and advanced decision capabilities to drive real-time business insights. Hypersonix.ai has been built from the ground up with new-age technology to simplify the consumption of data for our customers in various industry verticals. Hypersonix.ai is seeking a well-rounded, hands-on product leader to help lead product management of key capabilities and features.


About the Role

We are looking for talented and driven Data Engineers at various levels to work with customers to build the data warehouse, analytical dashboards and ML capabilities as per customer needs.


Roles and Responsibilities

  • Create and maintain optimal data pipeline architecture
  • Assemble large, complex data sets that meet functional / non-functional business requirements; should write complex queries in an optimized way
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies
  • Run ad-hoc analysis utilizing the data pipeline to provide actionable insights
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs
  • Keep our data separated and secure across national boundaries through multiple data centers and AWS regions
  • Work with analytics and data scientist team members and assist them in building and optimizing our product into an innovative industry leader
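As a toy illustration of the extract-transform-load work above, the following standard-library-only sketch cleans a small CSV feed and loads it into a table; the schema and sample data are invented, and a real pipeline would use Spark or AWS ‘big data’ services at scale:

```python
import csv
import io
import sqlite3

# Extract: raw CSV as it might arrive from an upstream source.
raw = io.StringIO("sku,price\nA1, 10 \nB2,not-a-number\nC3,7\n")

# Transform: strip whitespace and drop rows that fail type checks.
rows = []
for rec in csv.DictReader(raw):
    try:
        rows.append((rec["sku"].strip(), int(rec["price"].strip())))
    except ValueError:
        continue  # a real pipeline would quarantine and log bad records

# Load: write the cleaned rows into a warehouse-style table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (sku TEXT, price INTEGER)")
conn.executemany("INSERT INTO products VALUES (?, ?)", rows)

count, total = conn.execute("SELECT COUNT(*), SUM(price) FROM products").fetchone()
print(count, total)  # 2 17
```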


Requirements

  • Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases
  • Experience building and optimizing ‘big data’ data pipelines, architectures and data sets
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
  • Strong analytic skills related to working with unstructured datasets
  • Build processes supporting data transformation, data structures, metadata, dependency and workload management
  • A successful history of manipulating, processing and extracting value from large disconnected datasets
  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores
  • Experience supporting and working with cross-functional teams in a dynamic environment
  • We are looking for a candidate with 7+ years of experience in a Data Engineer role who holds a graduate degree in Computer Science or Information Technology, or has completed an MCA.
Read more
NeoGenCode Technologies Pvt Ltd
Posted by Akshay Patil
Remote only
5 - 10 yrs
₹10L - ₹22L / yr
Business Analysis
Healthcare
Requirements management
User stories
Gap analysis
+11 more

Position : Business Analyst

Experience : 5+ Years

Location : Remote

Notice Period : Immediate Joiners Preferred (or candidates serving 10–15 days’ notice)

Interview Mode : Virtual


Job Description :

We are seeking an experienced Business Analyst with a strong background in requirements gathering, functional documentation, and stakeholder management, particularly in the US Healthcare payer domain.


Mandatory Skills :

Business Analysis, US Healthcare Payer Domain, Requirement Gathering, User Stories, Gap & Impact Analysis, Azure DevOps/TFS, SQL, UML Modeling, SDLC/STLC, System Testing, UAT, Strong Communication Skills.


Key Responsibilities :

  • Analyze and understand complex business and functional requirements.
  • Translate business needs into detailed User Stories, functional and technical specifications.
  • Conduct gap analysis and impact assessment for new and existing product features.
  • Create detailed documentation including scope, project plans, and secure stakeholder approvals.
  • Support System Testing and User Acceptance Testing (UAT) from a functional perspective.
  • Prepare and maintain release notes, end-user documentation, training materials, and process flows.
  • Serve as a liaison between business and technical teams, ensuring cross-functional alignment.
  • Assist with sprint planning, user story tracking, and status updates using Azure DevOps / TFS.
  • Write and execute basic SQL queries for data validation and analysis.
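Basic data-validation queries of the kind mentioned above can be illustrated with Python's sqlite3; the claims table and its columns are hypothetical examples from a payer-style dataset:

```python
import sqlite3

# Throwaway in-memory database standing in for the real system under test.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (claim_id TEXT, member_id TEXT)")
conn.executemany("INSERT INTO claims VALUES (?, ?)",
                 [("C1", "M1"), ("C2", None), ("C2", "M3")])

# Validation 1: claims missing a member id.
missing = conn.execute(
    "SELECT COUNT(*) FROM claims WHERE member_id IS NULL").fetchone()[0]

# Validation 2: duplicated claim ids.
dupes = conn.execute(
    "SELECT COUNT(*) FROM (SELECT claim_id FROM claims "
    "GROUP BY claim_id HAVING COUNT(*) > 1)").fetchone()[0]

print(missing, dupes)  # 1 1
```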

Required Skills :

  • Minimum 5 years of experience as a Business Analyst.
  • Strong analytical, problem-solving, and communication skills.
  • Solid understanding of Project Life Cycle, STLC, and UML modeling.
  • Prior experience in US Healthcare payer domain is mandatory.
  • Familiarity with tools like Azure DevOps / TFS.
  • Ability to work with urgency, manage priorities, and maintain attention to detail.
  • Strong team collaboration and stakeholder management.
Read more
Parksmart
Agency job
via Parksmart by Saurav Kumar
Remote, Noida
0 - 1 yrs
₹10000 - ₹15000 / mo
NodeJS (Node.js)
Amazon Web Services (AWS)
React.js
SQL
MongoDB
+1 more


🚀 We're Urgently Hiring – Node.js Backend Development Intern

Join our backend team as an intern and get hands-on experience building scalable, real-world applications with Node.js, Firebase, and AWS.

📍 Remote / Onsite

📅 Duration: 2 Months


🔧 What You’ll Work On:

Backend development using Node.js

Firebase, SQL & NoSQL database management

RESTful API integration

Deployment on AWS infrastructure


Read more
Deltek
Posted by shwetha V
Remote only
4 - 7 yrs
Best in industry
Product support
Technical support
SQL
Customer Support


Support Services Analyst

Company Summary :

As the recognized global standard for project-based businesses, Deltek delivers software and information solutions to help organizations achieve their purpose. Our market leadership stems from the work of our diverse employees who are united by a passion for learning, growing and making a difference. At Deltek, we take immense pride in creating a balanced, values-driven environment, where every employee feels included and empowered to do their best work. Our employees put our core values into action daily, creating a one-of-a-kind culture that has been recognized globally. Thanks to our incredible team, Deltek has been named one of America's Best Midsize Employers by Forbes, a Best Place to Work by Glassdoor, a Top Workplace by The Washington Post and a Best Place to Work in Asia by World HRD Congress. www.deltek.com


Business Summary :

Deltek’s award-winning Support Services team provides best-in-class assistance to Deltek’s customers across the world via phone, chat and email. Our team comprises a diverse, collaborative and passionate group of professionals who come from varying industries, backgrounds and professions. Our diversity and passion are our strength, so however you identify and whatever background you bring, we invite you to explore our team as a potential next step in your career!


External Job Title :

Support Services Analyst

Position Responsibilities :

  • Serve as the second level of support for all customer queries.
  • Resolve queries and issues, ensuring all assigned requests are addressed within the SLA, or escalate further as required.
  • Take ownership of issues from start through to a successful resolution, following the escalation process to speed up resolution.
  • Work effectively and efficiently in partnership with other departments to prevent delays in resolution.
  • Apply technology in multiple ways to configure the product and help customers implement Replicon products.
  • Develop a solid understanding of product limits and suggest ways of improving the product.
  • Logically understand the concepts of other SaaS-based products for integration requests.
  • Be multi-skilled across all three mediums (phone, chat and email).

Qualifications :

  • Any Bachelor's Degree 
  • At least 2 years of experience in software application support and/or infrastructure support 
  • Basic understanding of Web technology, basic networking & hardware knowledge, and software applications 
  • Excellent communication skills - verbal, written, listening skills and interpersonal skills. 
  • Ability to communicate in a tactful, courteous manner and to deal with and resolve complex situations in a professional manner 
  • Ability to handle multiple tasks/projects simultaneously and effectively work individually or in a team environment 
  • Open to work in a 24/7 support environment 


Read more
FiftyFive Technologies Pvt Ltd
Remote, Indore, Jaipur, Gurugram
5 - 10 yrs
₹10L - ₹25L / yr
NodeJS (Node.js)
React.js
GraphQL
MongoDB
SQL
+2 more

Job Overview:


We are looking for a Full-Stack Developer with 4+ years of experience in software development. The ideal candidate will be proficient in both frontend and backend technologies, capable of building scalable and high-performance applications, and have a problem-solving mindset. You will collaborate with cross-functional teams to develop, optimize, and maintain web applications.

 

Key Responsibilities :

 

- Design, develop, and maintain web applications ensuring performance and scalability.  

- Work with backend services using Node.js (Express.js/NestJS) and databases.  

- Develop and maintain frontend applications using React.js (minimum 1 year experience required).  

- Integrate APIs (RESTful & GraphQL) and third-party services.  

- Write clean, maintainable, and efficient code following industry best practices.  

- Ensure security, reliability, and optimization in applications.  

- Participate in debugging, troubleshooting, and performance tuning.  

- Work closely with designers, product managers, and engineers to deliver high-quality solutions.  

- Stay updated with modern development trends and contribute to technical improvements.  

 

Required Skills & Qualifications :  


- 4+ years of experience in full-stack development.  

- Strong proficiency in JavaScript and TypeScript.  

- Hands-on experience with Node.js (Express.js/NestJS).  

- Minimum 1 year of experience working with React.js.  

- Experience with SQL (PostgreSQL, MySQL) and NoSQL (MongoDB) databases.  

- Proficiency in API design, development, and integration (RESTful, GraphQL).  

- Familiarity with version control tools (Git, GitHub/GitLab/Bitbucket).  

- Strong problem-solving and analytical skills.  

- Ability to work both independently and collaboratively in a team.  

 

Good to Have : 


- Experience with Cloud Services (AWS, Azure, or GCP).  

- Familiarity with Containerization (Docker, Kubernetes).  

- Knowledge of testing frameworks (Jest, Mocha, or Cypress).  

- Understanding of event-driven architectures and message queues (Kafka, RabbitMQ).

Read more
NeoGenCode Technologies Pvt Ltd
Posted by Akshay Patil
Remote, Kochi (Cochin), Trivandrum
8 - 15 yrs
₹10L - ₹24L / yr
Java
Spring Boot
Python
Angular (2+)
Amazon Web Services (AWS)
+7 more

Job Title : Technical Architect

Experience : 8 to 12+ Years

Location : Trivandrum / Kochi / Remote

Work Mode : Remote flexibility available

Notice Period : Immediate to max 15 days (30 days with negotiation possible)


Summary :

We are looking for a highly skilled Technical Architect with expertise in Java Full Stack development, cloud architecture, and modern frontend frameworks (Angular). This is a client-facing, hands-on leadership role, ideal for technologists who enjoy designing scalable, high-performance, cloud-native enterprise solutions.


🛠 Key Responsibilities :

  • Architect scalable and high-performance enterprise applications.
  • Hands-on involvement in system design, development, and deployment.
  • Guide and mentor development teams in architecture and best practices.
  • Collaborate with stakeholders and clients to gather and refine requirements.
  • Evaluate tools, processes, and drive strategic technical decisions.
  • Design microservices-based solutions deployed over cloud platforms (AWS/Azure/GCP).

Mandatory Skills :

  • Backend : Java, Spring Boot, Python
  • Frontend : Angular (at least 2 years of recent hands-on experience)
  • Cloud : AWS / Azure / GCP
  • Architecture : Microservices, EAI, MVC, Enterprise Design Patterns
  • Data : SQL / NoSQL, Data Modeling
  • Other : Client handling, team mentoring, strong communication skills

Nice to Have Skills :

  • Mobile technologies (Native / Hybrid / Cross-platform)
  • DevOps & Docker-based deployment
  • Application Security (OWASP, PCI DSS)
  • TOGAF familiarity
  • Test-Driven Development (TDD)
  • Analytics / BI / ML / AI exposure
  • Domain knowledge in Financial Services or Payments
  • 3rd-party integration tools (e.g., MuleSoft, BizTalk)

⚠️ Important Notes :

  • Only candidates from outside Hyderabad/Telangana and non-JNTU graduates will be considered.
  • Candidates must be serving notice or joinable within 30 days.
  • Client-facing experience is mandatory.
  • Java Full Stack candidates are highly preferred.

🧭 Interview Process :

  1. Technical Assessment
  2. Two Rounds – Technical Interviews
  3. Final Round
Read more
Hunarstreet Technologies pvt ltd


Agency job
via Hunarstreet Technologies pvt ltd by Sakshi Patankar
Remote only
10 - 20 yrs
₹15L - ₹30L / yr
Data engineering
databricks
Python
Scala
Spark
+14 more

What You’ll Be Doing:

● Design and build parts of our data pipeline architecture for extraction, transformation, and loading of data from a wide variety of data sources using the latest Big Data technologies.

● Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

● Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.

● Work with machine learning, data, and analytics experts to drive innovation, accuracy and greater functionality in our data system.

Qualifications:

● Bachelor's degree in Engineering, Computer Science, or relevant field.

● 10+ years of relevant and recent experience in a Data Engineer role.

● 5+ years of recent experience with Apache Spark and a solid understanding of the fundamentals.

● Deep understanding of Big Data concepts and distributed systems.

● Strong coding skills with Scala, Python, Java and/or other languages and the ability to quickly switch between them with ease.

● Advanced working SQL knowledge and experience working with a variety of relational databases such as Postgres and/or MySQL.

● Cloud experience with Databricks.

● Experience working with data stored in many formats including Delta Tables, Parquet, CSV and JSON.

● Comfortable working in a Linux shell environment and writing scripts as needed.

● Comfortable working in an Agile environment

● Machine Learning knowledge is a plus.

● Must be capable of working independently and delivering stable, efficient and reliable software.

● Excellent written and verbal communication skills in English.

● Experience supporting and working with cross-functional teams in a dynamic environment


EMPLOYMENT TYPE: Full-Time, Permanent

LOCATION: Remote (Pan India)

SHIFT TIMINGS: 2:00 PM to 11:00 PM IST

Read more
Automate Accounts

Posted by Namrata Das
Remote only
5 - 10 yrs
₹10L - ₹20L / yr
Python
SQL
Artificial Intelligence (AI)
Machine Learning (ML)
GitHub
+2 more

Responsibilities


Develop and maintain web and backend components using Python, Node.js, and Zoho tools


Design and implement custom workflows and automations in Zoho


Perform code reviews to maintain quality standards and best practices


Debug and resolve technical issues promptly


Collaborate with teams to gather and analyze requirements for effective solutions


Write clean, maintainable, and well-documented code


Manage and optimize databases to support changing business needs


Contribute individually while mentoring and supporting team members


Adapt quickly to a fast-paced environment and meet expectations within the first month



Leadership Opportunities


Lead and mentor junior developers in the team


Drive projects independently while collaborating with the broader team


Act as a technical liaison between the team and stakeholders to deliver effective solutions



Selection Process


1. HR Screening: Review of qualifications and experience


2. Online Technical Assessment: Test coding and problem-solving skills


3. Technical Interview: Assess expertise in web development, Python, Node.js, APIs, and Zoho


4. Leadership Evaluation: Evaluate team collaboration and leadership abilities


5. Management Interview: Discuss cultural fit and career opportunities


6. Offer Discussion: Finalize compensation and role specifics



Experience Required


5–7 years of relevant experience as a Software Developer


Proven ability to work as a self-starter and contribute individually


Strong technical and interpersonal skills to support team members effectively

Read more
BigRio
Posted by Disha Bhardwaj
Remote only
7 - 14 yrs
₹14L - ₹15L / yr
Documentation
Scripting
testing
JavaScript
SQL
+2 more

Solution Engineer                                                             


 

Primary Responsibilities

●         Serve as the primary resource during the client implementation/onboarding phase

 

●         Identify, document, and define customer business and technical needs

 

●         Develop clear user documentation, instructions, and standard procedures

 

●         Deliver training sessions on solution administration and usage

 

●         Participate in customer project calls and serve as a subject matter expert on solutions

 

●         Coordinate tasks across internal and client project teams, ensuring accountability and progress tracking

 

●         Perform hands-on configuration, scripting, data imports, testing, and knowledge transfer activities

 

●         Translate business requirements into technical specifications for product configuration or enhancements

 

●         Collaborate with global team members across multiple time zones, including the U.S., India, and China

 

●         Build and maintain strong customer relationships to gather and validate requirements


●         Contribute to the development of implementation best practices and suggest improvements to processes

 

●         Execute other tasks and duties as assigned

 

 

Note: Salary offered will depend on the candidate's qualifications and experience.




 

Required Skills & Experience

●         Proven experience leading software implementation projects from presales through delivery

 

●         Strong organizational skills with the ability to manage multiple detailed and interdependent tasks

 

●         2–5 years of experience in JavaScript and web development, including prior implementation work in a software company

 

●         Proficiency in some or all of the following:

 

○         JavaScript, PascalScript, MS SQL Script, RESTful APIs, Azure, Postman

 

○         Embarcadero RAD Studio, Delphi

 

○         Basic SQL and debugging

 

○         SMS integration and business intelligence tools

 

●         General knowledge of database structures and data migration processes

 

●         Familiarity with project management tools and methodologies

 

●         Strong interpersonal skills with a focus on client satisfaction and relationship-building

 

●         Self-starter with the ability to work productively in a remote, distributed team environment

 

●         Experience in energy efficiency retrofits, construction, or utility demand-side management is a plus

Read more
BigRio
Posted by Disha Bhardwaj
Remote only
12 - 17 yrs
₹35L - ₹45L / yr
C#
SQL
Java
Microsoft Windows Azure

Responsibilities:


●        Technical Leadership:                                                                                                     

○        Architect and design complex software systems

○        Lead the development team in implementing software solutions

○        Ensure adherence to coding standards and best practices

○        Conduct code reviews and provide constructive feedback

○        Troubleshoot and resolve technical issues

●        Project Management:                                                                                                       

○        Collaborate with project managers to define project scope and requirements

○        Estimate project timelines and resource needs

○        Track project progress and ensure timely delivery

○        Manage risks and identify mitigation strategies

●        Team Development:                                                                                                         

○        Mentor and coach junior developers

○        Foster a collaborative and supportive team environment

○        Conduct performance evaluations and provide feedback

○        Identify training and development opportunities for team members

●        Innovation:                                                                                                                          

○        Stay abreast of emerging technologies and industry trends

○        Evaluate and recommend new technologies for adoption

○        Encourage experimentation and innovation within the team

 

Qualifications

 

●        Experience:

○        12+ years of experience in software development

○        4+ years of experience in a leadership role

○        Proven track record of delivering successful software projects

●        Skills:

○        Strong proficiency in C#

○        Good knowledge of Java for reporting

○        Strong in SQL and Microsoft Azure

○        Expertise in software development methodologies (e.g., Agile, Scrum)

○        Excellent problem-solving and analytical skills

○        Strong communication and interpersonal skills

○        Ability to work independently and as part of a team


Read more
RockED

at RockED

2 candid answers
Kashish Trehan
Posted by Kashish Trehan
Remote only
4 - 9 yrs
₹15L - ₹40L / yr
skill iconNodeJS (Node.js)
MySQL
skill iconJavascript
SQL
skill iconExpress
+3 more

Your Impact

  • Build scalable backend services.
  • Design, implement, and maintain databases, ensuring data integrity, security, and efficient retrieval.
  • Implement the core logic that makes applications work, handling data processing, user requests, and system operations
  • Contribute to the architecture and design of new product features
  • Optimize systems for performance, scalability, and security
  • Stay up-to-date with new technologies and frameworks, contributing to the advancement of software development practices
  • Work closely with product managers and designers to turn ideas into reality and shape the product roadmap.

What skills do you need?


  • 4+ years of experience in backend development, especially building robust APIs using Node.js, Express.js, and MySQL
  • Strong command of JavaScript and understanding of its quirks and best practices
  • Ability to think strategically when designing systems—not just how to build, but why
  • Exposure to system design and interest in building scalable, high-availability systems
  • Prior work on B2C applications with a focus on performance and user experience
  • Ensure that applications can handle increasing loads and maintain performance, even under heavy traffic
  • Work with complex queries for performing sophisticated data manipulation, analysis, and reporting.
  • Knowledge of Sequelize, MongoDB and AWS would be an advantage.
  • Experience in optimizing backend systems for speed and scalability.


Read more
Astegic

at Astegic

3 recruiters
Agency job
via Hunarstreet Technologies pvt ltd by Priyanka Londhe
Remote only
10 - 13 yrs
₹30L - ₹50L / yr
skill iconScala
Apache Spark
Big Data
skill iconPython
skill iconJava
+3 more

POSITION:

Senior Data Engineer

The Senior Data Engineer will be responsible for building and extending our data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys working with big data and building systems from the ground up.

You will collaborate with our software engineers, database architects, data analysts and data scientists to ensure our data delivery architecture is consistent throughout the platform. You must be self-directed and comfortable supporting the data needs of multiple teams, systems and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company’s data architecture to support our next generation of products and data initiatives.


What You’ll Be Doing:

● Design and build parts of our data pipeline architecture for extraction, transformation, and loading of data from a wide variety of data sources using the latest Big Data technologies.

● Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

● Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.

● Work with machine learning, data, and analytics experts to drive innovation, accuracy and greater functionality in our data system.
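
The extraction, transformation, and loading work described above can be shown in miniature. The sketch below is illustrative only: it uses Python's standard library, with an in-memory SQLite table standing in for a production warehouse, and the sample data, table, and column names are invented.

```python
import csv
import io
import sqlite3

# Extract: read raw rows from a CSV source (an in-memory sample here).
raw = "user_id,amount\n1,10.50\n2,not_a_number\n3,7.25\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: drop malformed rows and cast types.
def clean(row):
    try:
        return (int(row["user_id"]), float(row["amount"]))
    except ValueError:
        return None  # reject rows that fail type conversion

cleaned = [r for r in (clean(row) for row in rows) if r is not None]

# Load: insert into a relational target (SQLite stands in for Postgres/MySQL).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (user_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO payments VALUES (?, ?)", cleaned)
total = conn.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
```

A production pipeline adds logging, retries, and schema validation, but the extract/transform/load separation stays the same.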

Qualifications:

● Bachelor's degree in Engineering, Computer Science, or a relevant field.

● 10+ years of relevant and recent experience in a Data Engineer role.

● 5+ years recent experience with Apache Spark and solid understanding of the fundamentals.

● Deep understanding of Big Data concepts and distributed systems.

● Strong coding skills with Scala, Python, Java and/or other languages and the ability to quickly switch between them with ease.

● Advanced working SQL knowledge and experience working with a variety of relational databases such as Postgres and/or MySQL.

● Cloud experience with Databricks.

● Experience working with data stored in many formats including Delta Tables, Parquet, CSV and JSON.

● Comfortable working in a linux shell environment and writing scripts as needed.

● Comfortable working in an Agile environment

● Machine Learning knowledge is a plus.

● Must be capable of working independently and delivering stable, efficient, and reliable software.

● Excellent written and verbal communication skills in English.

● Experience supporting and working with cross-functional teams in a dynamic environment.

REPORTING: This position will report to our CEO or any other Lead as assigned by Management.

EMPLOYMENT TYPE: Full-Time, Permanent

LOCATION: Remote (Pan India)

SHIFT TIMINGS: 2:00 PM to 11:00 PM IST

WHO WE ARE:

SalesIntel is the top revenue intelligence platform on the market. Our combination of automation and researchers allows us to reach 95% data accuracy for all our published contact data, while continuing to scale up our number of contacts. We currently have more than 5 million human-verified contacts, another 70 million plus machine-processed contacts, and the highest number of direct dial contacts in the industry. We guarantee our accuracy with our well-trained research team that re-verifies every direct dial number, email, and contact every 90 days. With the most comprehensive contact and company data and our excellent customer service, SalesIntel has the best B2B data available. For more information, please visit – www.salesintel.io

WHAT WE OFFER: SalesIntel’s workplace is all about diversity. Different countries and cultures are represented in our workforce. We are growing at a fast pace and our work environment is constantly evolving with changing times. We motivate our team to better themselves by offering all the good stuff you’d expect like Holidays, Paid Leaves, Bonuses, Incentives, Medical Policy and company paid Training Programs.

SalesIntel is an Equal Opportunity Employer. We prohibit discrimination and harassment of any type and offer equal employment opportunities to employees and applicants without regard to race, color, religion, sex, sexual orientation, gender identity or expression, pregnancy, age, national origin, disability status, genetic information, protected veteran status, or any other characteristic protected by law.

Read more
Remote only
4 - 10 yrs
₹20L - ₹30L / yr
Artificial Intelligence (AI)
Large Language Models (LLM) tuning
skill iconPython
SQL
Retrieval Augmented Generation (RAG)
+10 more

Knowledge of the Gen AI technology ecosystem, including top-tier LLMs, prompt engineering, development frameworks such as LlamaIndex and LangChain, LLM fine-tuning, and experience architecting RAG and other LLM-based solutions for enterprise use cases.

1. Strong proficiency in programming languages like Python and SQL.

2. 3+ years of experience in predictive/prescriptive analytics, including Machine Learning algorithms (supervised and unsupervised), deep learning algorithms, and Artificial Neural Networks such as regression, classification, ensemble models, RNN, LSTM, GRU.

3. 2+ years of experience in NLP, text analytics, Document AI, OCR, sentiment analysis, entity recognition, and topic modeling.

4. Proficiency in LangChain and open LLM frameworks to perform summarization, classification, named entity recognition, and question answering.

5. Proficiency in generative techniques: prompt engineering, vector DBs, and LLMs such as OpenAI, LlamaIndex, and Azure OpenAI; open-source LLMs will be important.

6. Hands-on experience in GenAI technology areas including RAG architecture, fine-tuning techniques, inferencing frameworks, etc.

7. Familiarity with big data technologies/frameworks.

8. Sound knowledge of Microsoft Azure.
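
The retrieval step at the core of a RAG architecture can be sketched without any framework: rank stored document vectors by similarity to the query embedding and pass the top matches to the LLM as grounding context. The vectors and document names below are hand-made toy values for illustration, not output from a real embedding model.

```python
import math

# Toy document store: in a real RAG system these vectors come from an
# embedding model; here they are invented 3-d vectors.
docs = {
    "refund policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.8, 0.1],
    "warranty terms": [0.2, 0.1, 0.9],
}

def cosine(a, b):
    # Cosine similarity between two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def retrieve(query_vec, k=1):
    # Rank documents by similarity to the query embedding, keep the top k.
    ranked = sorted(docs, key=lambda d: cosine(docs[d], query_vec), reverse=True)
    return ranked[:k]

# A query embedding close to "refund policy"; the retrieved text would be
# prepended to the LLM prompt as context.
context = retrieve([0.85, 0.15, 0.05])
```

In practice a vector database handles the storage and nearest-neighbour search, but the ranking logic is the same idea.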

Read more
Deqode

at Deqode

1 recruiter
Shraddha Katare
Posted by Shraddha Katare
Remote only
10 - 12 yrs
₹10L - ₹20L / yr
Symfony
skill iconPHP
SQL
skill iconAmazon Web Services (AWS)
skill iconJavascript

Profile: Senior PHP Developer

Experience: 10+ Years

Mode: Remote

Required Skills:

  • PHP (10+ years) & Symfony framework (5+ years)
  • Team leadership experience (3+ years)
  • OOP, design patterns, RESTful APIs
  • Database optimization (MySQL/PostgreSQL)
  • Git, CI/CD, testing frameworks
  • Excellent communication skills

Responsibilities:

  • Lead PHP/Symfony development
  • Mentor team members
  • Ensure code quality through reviews
  • Collaborate with stakeholders
  • Manage sprint cycles
  • Optimize application performance


Read more
Quanteon Solutions
DurgaPrasad Sannamuri
Posted by DurgaPrasad Sannamuri
Remote, Hyderabad
5 - 8 yrs
₹25L - ₹35L / yr
skill iconReact.js
skill iconNodeJS (Node.js)
skill iconJavascript
SQL
skill iconMongoDB
+5 more

We are looking for a highly skilled Senior Software Engineer with over 5 years of experience in full stack development using React.js and Node.js. As a senior member of our engineering team, you’ll take ownership of complex technical challenges, influence architecture decisions, mentor junior developers, and contribute to high-impact products.


Key Responsibilities:

Design, build, and maintain scalable web applications using React.js (frontend) and Node.js (backend).

Architect robust, secure, and scalable backend APIs and frontend components.

Collaborate closely with Product Managers, Designers, and DevOps to deliver end-to-end features.

Conduct code reviews, enforce best practices, and guide junior developers.

Optimize application performance, scalability, and responsiveness.

Troubleshoot, debug, and upgrade existing systems.

Stay current with new technologies and advocate for continuous improvement.


Required Qualifications:

Bachelor’s or Master’s degree in Computer Science, Engineering, or related field.

5+ years of experience in full stack development.

Strong expertise in React.js and related libraries (Redux, Hooks, etc.).

In-depth experience with Node.js, Express.js, and RESTful APIs.

Proficiency with JavaScript/TypeScript and modern frontend tooling (Webpack, Babel, etc.).

Experience with relational and NoSQL databases (e.g., PostgreSQL, MongoDB).

Solid understanding of CI/CD, testing (Jest, Mocha), and version control (Git).

Familiarity with cloud services (AWS/GCP/Azure) and containerization (Docker, Kubernetes) is a plus.

Excellent communication and problem-solving skills.


Nice to Have:

Experience with microservices architecture.

Knowledge of GraphQL.

Exposure to serverless computing.

Prior experience working in Agile/Scrum teams.

Read more
 Zazmic Inc

Zazmic Inc

Agency job
Remote only
5 - 8 yrs
₹10L - ₹15L / yr
Artificial Intelligence (AI)
skill iconMachine Learning (ML)
databricks
skill iconPython
SQL
+4 more

Title: Data Engineer II (Remote – India/Portugal)

Exp: 4-8 Years

CTC: up to 30 LPA


Required Skills & Experience:

  • 4+ years in data engineering or backend software development
  • Experience with AI/ML is important
  • Expert in SQL and data modeling
  • Strong Python, Java, or Scala coding skills
  • Experience with Snowflake, Databricks, AWS (S3, Lambda)
  • Background in relational and NoSQL databases (e.g., Postgres)
  • Familiar with Linux shell and systems administration
  • Solid grasp of data warehouse concepts and real-time processing
  • Excellent troubleshooting, documentation, and QA mindset


If interested, kindly share your updated CV to 82008 31681

Read more
Deqode

at Deqode

1 recruiter
Roshni Maji
Posted by Roshni Maji
Remote only
5 - 7 yrs
₹12L - ₹16L / yr
skill iconPython
Google Cloud Platform (GCP)
SQL
PySpark
Data Transformation Tool (DBT)
+2 more

Role: GCP Data Engineer

Notice Period: Immediate Joiners

Experience: 5+ years

Location: Remote

Company: Deqode


About Deqode

At Deqode, we work with next-gen technologies to help businesses solve complex data challenges. Our collaborative teams build reliable, scalable systems that power smarter decisions and real-time analytics.


Key Responsibilities

  • Build and maintain scalable, automated data pipelines using Python, PySpark, and SQL.
  • Work on cloud-native data infrastructure using Google Cloud Platform (BigQuery, Cloud Storage, Dataflow).
  • Implement clean, reusable transformations using DBT and Databricks.
  • Design and schedule workflows using Apache Airflow.
  • Collaborate with data scientists and analysts to ensure downstream data usability.
  • Optimize pipelines and systems for performance and cost-efficiency.
  • Follow best software engineering practices: version control, unit testing, code reviews, CI/CD.
  • Manage and troubleshoot data workflows in Linux environments.
  • Apply data governance and access control via Unity Catalog or similar tools.


Required Skills & Experience

  • Strong hands-on experience with PySpark, Spark SQL, and Databricks.
  • Solid understanding of GCP services (BigQuery, Cloud Functions, Dataflow, Cloud Storage).
  • Proficiency in Python for scripting and automation.
  • Expertise in SQL and data modeling.
  • Experience with DBT for data transformations.
  • Working knowledge of Airflow for workflow orchestration.
  • Comfortable with Linux-based systems for deployment and troubleshooting.
  • Familiar with Git for version control and collaborative development.
  • Understanding of data pipeline optimization, monitoring, and debugging.
Read more
Remote, Delhi, Gurugram, Noida, Ghaziabad, Faridabad
2 - 5 yrs
₹3L - ₹9L / yr
SQL
skill iconXML
JSON
TDL

Job Description:

As a Tally Developer, your main responsibility will be to develop custom solutions in Tally using TDL as per customer requirements. You will work closely with clients, business analysts, senior developers, and other stakeholders to understand their requirements and translate them into effective Tally-based solutions.

Responsibilities:

Collaborate with business analysts and senior developers/project managers to gather and analyze client requirements.

Design, develop, and customize Tally-based software solutions to meet the specific requirements of clients.

Write efficient and well-documented code in Tally Definition Language (TDL) to extend the functionality of Tally software.

Follow the Software Development Life Cycle including requirements gathering, design, coding, testing, and deployment.

Troubleshoot and debug issues related to Tally customization, data import/export, and software integrations.

Provide technical support and assistance to clients and end-users in utilizing and troubleshooting Tally-based software solutions.

Stay updated with the latest features and updates in Tally software to leverage new functionalities in solution development.

Adhere to coding standards, documentation practices, and quality assurance processes.

Requirements:

Any Degree. Relevant work experience may be considered in place of a degree.

Experience in Tally development and customization for projects using Tally Definition Language (TDL).

Hands-on experience in Tally and implementation of its features.

Familiarity with database systems, data structures, and SQL for efficient data management and retrieval.

Strong problem-solving skills and attention to detail.

Good communication and teamwork abilities.

Continuous learning mindset to keep up with advancements in Tally software and related technologies.

Key Skills Required:

TDL (Tally Definition Language), Tally, Excel, XML/JSON.

Good to have Basic Skills:

Database like MS SQL, MySQL

API Integration.

WORK EXPERIENCE: MINIMUM 2 YEARS AND MAXIMUM 7 YEARS

Interested candidates may WhatsApp their CV to TRIPLE NINE ZERO NINE THREE DOUBLE ONE DOUBLE FOUR.


Please answer the questions below:

Do you have knowledge of Tally Definition Language?

How much experience do you have with TDL?

Read more
Pattem Digital Technologies
Sanchari Sharma
Posted by Sanchari Sharma
Remote only
5 - 10 yrs
₹10L - ₹20L / yr
SQL
adobe campaign classic tool

The Consultant / Senior Consultant – Adobe Campaign is a technical role that involves providing consulting advice and support to clients implementing the Adobe Campaign solution, along with any technical advisory required afterwards. This is a client-facing role: the consultant liaises with the client, understands their technical and business requirements, and then implements the Adobe Campaign solution in a way that gives the client the most value. The consultant's main objective is to drive successful delivery and maintain a high level of customer satisfaction.

What you need to succeed

• Expertise and experience in SQL (Oracle / SQL Server / PostgreSQL)

• Programming experience (JavaScript / Java / VB / C# / PHP)

• Knowledge on Web Technologies like HTML, CSS would be a plus

• Good communication skills to ensure effective customer interactions, communications, and documentation

• Self-starter - Organized and highly motivated

• Fast learner, ability to learn new technologies/languages

• Knowledge of HTML DOM manipulation and page load events a plus

• Project Management skills a plus

• Ability to develop creative solutions to problems

• Able to multi-task in a dynamic environment

• Able to work independently with minimal supervision

• Experience leading team members will be a plus

Adobe is an equal opportunity/affirmative action employer. We welcome and encourage diversity in the workplace.


Read more
Cymetrix Software

at Cymetrix Software

2 candid answers
Netra Shettigar
Posted by Netra Shettigar
Remote only
3 - 9 yrs
₹12L - ₹24L / yr
Looker
lookML
bigquery
SQL
Google Cloud Platform (GCP)

Proficient in Looker Actions, Looker dashboarding, Looker data entry, LookML, SQL queries, BigQuery, Looker Studio, and GCP.



Remote Working

2 pm to 12 am IST or

10:30 AM to 7:30 PM IST

Sunday to Thursday



Responsibilities:

● Create and maintain LookML code, which defines data models, dimensions, measures, and relationships within Looker.

● Develop reusable LookML components to ensure consistency and efficiency in report and dashboard creation.

● Build and customize dashboards, incorporating data visualizations such as charts and graphs to present insights effectively.

● Write complex SQL queries when necessary to extract and manipulate data from underlying databases, and optimize those queries for performance.

● Connect Looker to various data sources, including databases, data warehouses, and external APIs.

● Identify and address bottlenecks that affect report and dashboard loading times, and optimize Looker performance by tuning queries, caching strategies, and exploring indexing options.

● Configure user roles and permissions within Looker to control access to sensitive data, and implement data security best practices, including row-level and field-level security.

● Develop custom applications or scripts that interact with Looker's API for automation and integration with other tools and systems.

● Use version control systems (e.g., Git) to manage LookML code changes and collaborate with other developers.

● Provide training and support to business users, helping them navigate and use Looker effectively.

● Diagnose and resolve technical issues related to Looker, data models, and reports.
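
Query optimization work of the kind described above is usually verified by reading the planner's output before and after a change. A minimal illustration, with SQLite standing in for BigQuery or another warehouse purely for demonstration (the table and index names are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, ts TEXT, value REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(i % 100, f"2024-01-{i % 28 + 1:02d}", float(i)) for i in range(1000)],
)

query = "SELECT SUM(value) FROM events WHERE user_id = ?"

# Without an index the planner must scan the whole table.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

# An index on the filtered column lets the planner seek instead of scan.
conn.execute("CREATE INDEX idx_events_user ON events(user_id)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()
```

Each engine has its own equivalent (`EXPLAIN` in BigQuery's UI, `EXPLAIN ANALYZE` in Postgres), but the workflow of comparing plans before and after tuning is the same.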


Skills Required:

● Experience in Looker's modeling language, LookML, including data models, dimensions, and measures.

● Strong SQL skills for writing and optimizing database queries across different SQL databases (GCP/BQ preferable)

● Knowledge of data modeling best practices

● Proficient in BigQuery, billing data analysis, GCP billing, unit costing, and invoicing, with the ability to recommend cost optimization strategies.

● Previous experience in Finops engagements is a plus

● Proficiency in ETL processes for data transformation and preparation.

● Ability to create effective data visualizations and reports using Looker’s dashboard tools.

● Ability to optimize Looker performance by fine-tuning queries, caching strategies, and indexing.

● Familiarity with related tools and technologies, such as data warehousing (e.g., BigQuery ), data transformation tools (e.g., Apache Spark), and scripting languages (e.g., Python).

Read more
Remote only
3 - 5 yrs
₹4L - ₹7L / yr
Netsuite
SOAP
Suite Script
SuiteScript2.0
ODBC
+1 more

Job Summary:

SiGa Systems is looking for a skilled and motivated Software Developer with expertise in NetSuite API and ODBC integrations. The ideal candidate will design, develop, and maintain robust data integration solutions to seamlessly move data between NetSuite and external database systems. This role demands a deep understanding of NetSuite’s data model, SuiteTalk APIs, ODBC connectivity, and strong programming skills for data manipulation and integration.

Key Responsibilities:

1. NetSuite API Development

  • Design and implement custom integrations using NetSuite SuiteTalk REST and SOAP APIs.
  • Develop efficient, scalable scripts using SuiteScript 1.0 and 2.x.
  • Build and maintain Suitelets, Scheduled Scripts, User Event Scripts, and other custom NetSuite components.
  • Troubleshoot and resolve issues related to NetSuite API connections and data workflows.

2. ODBC Data Integration

  • Set up and manage ODBC connections for accessing NetSuite data.
  • Write complex SQL queries and stored procedures for ETL (Extract, Transform, Load) processes.
  • Design and execute data synchronization workflows between NetSuite and external databases (e.g., SQL Server, MySQL, PostgreSQL).
  • Ensure optimal performance and data accuracy across systems.

3. Data Modeling & Database Management

  • Analyze NetSuite data models and design efficient schemas for target systems.
  • Perform data mapping, transformation, and migration tasks.
  • Ensure data consistency and integrity throughout integration pipelines.
  • Monitor database performance and maintain system reliability.

4. Software Development & Documentation

  • Write clean, maintainable, and well-documented code.
  • Participate in code reviews and contribute to coding best practices.
  • Maintain technical documentation, including API specs, integration flows, and data mapping docs.
  • Use version control systems (e.g., Git) for collaboration and code management.

5. Collaboration & Communication

  • Work closely with business analysts, project managers, and cross-functional teams to understand integration requirements.
  • Provide technical guidance and regular progress updates to stakeholders.
  • Participate actively in Agile development processes and contribute to sprint planning and retrospectives.
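
Data synchronization workflows like those above usually reduce to an idempotent upsert on the target database, so a sync can be re-run safely after a partial failure. A hypothetical sketch, with SQLite standing in for SQL Server/MySQL/PostgreSQL; in a real integration the incoming rows would arrive over the ODBC connection to NetSuite, and the table here is invented:

```python
import sqlite3

# Target database standing in for the external system.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, balance REAL)"
)
conn.execute("INSERT INTO customers VALUES (1, 'Acme', 100.0)")

# Rows arriving from the source: one update, one new record.
incoming = [(1, "Acme Corp", 150.0), (2, "Globex", 50.0)]

# Idempotent upsert: insert new rows, update existing ones in place.
conn.executemany(
    """INSERT INTO customers (id, name, balance) VALUES (?, ?, ?)
       ON CONFLICT(id) DO UPDATE SET name = excluded.name,
                                     balance = excluded.balance""",
    incoming,
)
rows = conn.execute("SELECT id, name, balance FROM customers ORDER BY id").fetchall()
```

The `ON CONFLICT ... DO UPDATE` form shown is SQLite's; SQL Server uses `MERGE` and MySQL uses `INSERT ... ON DUPLICATE KEY UPDATE`, but the pattern is the same.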


Read more
Agivant Technologies

Agivant Technologies

Agency job
via Vidpro Consultancy Services by ashik thahir
Remote only
5 - 10 yrs
₹18L - ₹25L / yr
skill iconPython
SQL
Airflow
Snowflake
skill iconElastic Search
+3 more

Experience: 5-8 Years

Work Mode: Remote

Job Type: Fulltime

Mandatory Skills: Python, SQL, Snowflake, Airflow, ETL, Data Pipelines, Elastic Search, and AWS.


Role Overview:

We are looking for a talented and passionate Senior Data Engineer to join our growing data team. In this role, you will play a key part in building and scaling our data infrastructure, enabling data-driven decision-making across the organization. You will be responsible for designing, developing, and maintaining efficient and reliable data pipelines for both ELT (Extract, Load, Transform) and ETL (Extract, Transform, Load) processes.


Responsibilities:

  • Design, develop, and maintain robust and scalable data pipelines for ELT and ETL processes, ensuring data accuracy, completeness, and timeliness.
  • Work with stakeholders to understand data requirements and translate them into efficient data models and pipelines.
  • Build and optimize data pipelines using a variety of technologies, including Elastic Search, AWS S3, Snowflake, and NFS.
  • Develop and maintain data warehouse schemas and ETL/ELT processes to support business intelligence and analytics needs.
  • Implement data quality checks and monitoring to ensure data integrity and identify potential issues.
  • Collaborate with data scientists and analysts to ensure data accessibility and usability for various analytical purposes.
  • Stay current with industry best practices, CI/CD/DevSecFinOps, Scrum and emerging technologies in data engineering.
  • Contribute to the development and enhancement of our data warehouse architecture
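
A data-quality check of the kind listed above can be as small as a function that returns the rules a batch violates before the load proceeds; the rules and field names here are illustrative only, not any particular team's checks.

```python
def run_checks(rows):
    # Return the names of all quality rules the batch violates.
    issues = []
    ids = [r["id"] for r in rows]
    if len(ids) != len(set(ids)):
        issues.append("duplicate ids")
    if any(r["amount"] is None for r in rows):
        issues.append("null amounts")
    if any(r["amount"] is not None and r["amount"] < 0 for r in rows):
        issues.append("negative amounts")
    return issues

# A sample batch that trips every rule.
batch = [
    {"id": 1, "amount": 9.99},
    {"id": 2, "amount": None},
    {"id": 2, "amount": -5.0},
]
problems = run_checks(batch)
```

In a pipeline, a non-empty result would typically fail the task or route the batch to quarantine rather than loading it downstream.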

Required Skills:

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • 5+ years of experience as a Data Engineer with a strong focus on ELT/ETL processes.
  • At least 3+ years of experience with Snowflake data warehousing technologies.
  • At least 3+ years of experience creating and maintaining Airflow ETL pipelines.
  • Minimum 3+ years of professional experience with Python for data manipulation and automation.
  • Working experience with Elastic Search and its application in data pipelines.
  • Proficiency in SQL and experience with data modelling techniques.
  • Strong understanding of cloud-based data storage solutions such as AWS S3.
  • Experience working with NFS and other file storage systems.
  • Excellent problem-solving and analytical skills.
  • Strong communication and collaboration skills.


Read more
Sun King

at Sun King

2 candid answers
1 video
Reshika Mendiratta
Posted by Reshika Mendiratta
Remote only
1yr+
Best in industry
skill iconJava
skill iconSpring Boot
J2EE
Microservices
Hibernate (Java)
+6 more

About Sun King

Sun King is the world’s leading off-grid solar energy company, delivering energy access to 1.8 billion people without reliable grid connections through innovative product design, fintech solutions, and field operations.

Key highlights:

  • Connected over 20 million homes to solar power across Africa and Asia, adding 200,000 homes monthly.
  • Affordable ‘pay-as-you-go’ financing model; after 1-2 years, customers own their solar equipment.
  • Saved customers over $4 billion to date.
  • Collect 650,000 daily payments via 28,000 field agents using mobile money systems.
  • Products range from home lighting to high-energy appliances, with expansion into clean cooking, electric mobility, and entertainment.

With 2,800 staff across 12 countries, our team includes experts in various fields, all passionate about serving off-grid communities.

Diversity Commitment:

44% of our workforce are women, reflecting our commitment to gender diversity.


About the role:

The Backend Developer works remotely as part of the technology team to help Sun King’s EasyBuy business unit design and develop software to improve its field team operations.


What you will be expected to do

  • Design and develop applications/systems based on wireframes and product requirements documents. 
  • Design and develop logical and physical data models to meet application requirements. 
  • Identify and resolve bottlenecks and bugs based on operational requirements.
  • Perform unit tests on code to ensure robustness, including edge cases, usability, and general reliability. 
  • Write reusable and easily maintainable code following the principles of DRY (Don’t Repeat Yourself). 
  • Integrate existing tools and business systems, both in-house and external services, such as ticketing software and communication tools. 
  • Collaborate with team members and product managers to understand project requirements and contribute to the overall system design. 


You might be a strong candidate if you have/are

  • Have 1-2 years of backend development experience, with strong problem-solving abilities and proficiency in data structures and algorithms. 
  • Have a profound grasp of object-oriented programming (OOP) standards and expertise in Core Java. 
  • Have knowledge of SQL, MySQL, or similar database management systems. 
  • Have experience integrating web services and formats such as SOAP, REST, JSON, and XML. 
  • Have familiarity with RESTful APIs for linking Android applications to backend services. 
  • Have experience with version control systems like Git (preferred, not mandatory). 
  • Additional knowledge of web technologies like HTML, CSS, and JavaScript, and frameworks like Spring or Hibernate, would be advantageous. 
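
Integrating a REST/JSON web service, as mentioned above, often means following a pagination cursor until the service reports no further pages. The sketch below (in Python for brevity; the same pattern applies in Java) stubs out the HTTP layer with canned JSON responses, and the endpoint shape and field names are invented for illustration.

```python
import json

# Canned responses standing in for HTTP GETs against a paginated endpoint.
PAGES = {
    1: '{"items": [{"id": 1}, {"id": 2}], "next_page": 2}',
    2: '{"items": [{"id": 3}], "next_page": null}',
}

def fetch_page(page):
    # A real client would issue an HTTP request here.
    return json.loads(PAGES[page])

def fetch_all():
    # Follow the pagination cursor until the service reports no next page.
    items, page = [], 1
    while page is not None:
        body = fetch_page(page)
        items.extend(body["items"])
        page = body["next_page"]
    return items

ids = [item["id"] for item in fetch_all()]
```

Swapping the stub for a real HTTP call (with retries and backoff) leaves the cursor-following loop unchanged.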


What we offer (in addition to compensation and statutory benefits):

  • A platform for professional growth in a rapidly expanding, high-impact sector.
  • Immerse in a collaborative culture, energized by employees of Sun King who are collectively motivated by fostering a transformative, sustainable venture.
  • A genuinely global environment: Engage and learn alongside a diverse group from varied geographies and backgrounds.
  • Tailored learning pathways through the Sun King Center for Leadership to elevate your leadership and managerial capabilities.
Read more
HashRoot
Agency job
via HashRoot by Deepak S
Remote only
4 - 15 yrs
₹6L - ₹15L / yr
Windows Azure
DevOps
SQL
Shell Scripting
Bash
+2 more

Overview

As an engineer in the Service Operations division, you will be responsible for the day-to-day management of the systems and services that power client products. Working with your team, you will ensure daily tasks and activities are successfully completed and where necessary, use standard operating procedures and knowledge to resolve any faults/errors encountered.


Job Description

Key Tasks and Responsibilities:

Ensure daily tasks and activities have been completed successfully; where this is not the case, undertake recovery and remediation steps.

Undertake patching and upgrade activities in support of ParentPay compliance programs. These being PCI DSS, ISO27001 and Cyber Essentials+.

Action requests from the ServiceNow work queue that have been allocated to your relevant resolver group. These include incidents, problems, changes and service requests.

Investigate alerts and events detected from the monitoring systems that indicate a change in component health.

Create and maintain support documentation in the form of departmental wiki and ServiceNow knowledge articles that allow for continual improvement of fault detection and recovery times.

Work with colleagues to identify and champion the automation of all manual interventions undertaken within the team.

Attend and complete all mandatory training courses.

Engage and own the transition of new services into Service Operations.

Participate in the out of hours on call support rota.


Qualifications and Experience:

Experience working in an IT service delivery or support function OR

MBA or Degree in Information Technology or Information Security.

Experience working with Microsoft technologies.

Excellent communication skills developed working in a service-centric organisation.

Ability to interpret fault descriptions provided by customers or internal escalations and translate these into resolutions.

Ability to manage and prioritise own workload.

Experience working within Education Technology would be an advantage.


Technical knowledge:

Advanced automation scripting using Terraform and PowerShell.

Knowledge of Bicep and Ansible is advantageous.

Advanced Microsoft Active Directory configuration and support.

Microsoft Azure and AWS cloud hosting platform administration.

Advanced Microsoft SQL Server experience.

Windows Server and desktop management and configuration.

Microsoft IIS web services administration and configuration.

Advanced management of data and SQL backup solutions.

Advanced scripting and automation capabilities. 

Advanced knowledge of Azure analytics and KQL.

 

Skills & Requirements

IT Service Delivery, Information Technology, Information Security, Microsoft Technologies, Communication Skills, Fault Interpretation, Workload Prioritization, Automation Scripting, Terraform, PowerShell, Microsoft Active Directory, Microsoft Azure, AWS, Microsoft SQL Server, Windows Server, Windows Desktop Configuration, Microsoft IIS, Data Backup Management, SQL Backup Solutions, Scripting, Azure Analytics, KQL.

Read more
Adesso India

Agency job
via HashRoot by Deepak S
Remote only
4 - 15 yrs
₹8L - ₹25L / yr
skill iconPython
SQL
skill iconMongoDB
bigquery
skill iconJava

Overview

Adesso India specialises in optimization of core business processes for organizations. Our focus is on providing state-of-the-art solutions that streamline operations and elevate productivity to new heights.

Comprised of a team of industry experts and experienced technology professionals, we ensure that our software development and implementations are reliable, robust, and seamlessly integrated with the latest technologies. By leveraging our extensive knowledge and skills, we empower businesses to achieve their objectives efficiently and effectively.


Job Description

We are looking for an experienced Backend and Data Developer with expertise in Java, SQL, and BigQuery development on public clouds, mainly GCP. As a Senior Data Developer, you will play a vital role in designing, building, and maintaining robust systems that support our data analytics. This position offers the opportunity to work on complex services, collaborating closely with cross-functional teams to drive successful project delivery.


Responsibilities:

Development and maintenance of data pipelines and automation scripts with Python.

Creation of data queries and optimization of database processes with SQL.

Use of bash scripts for system administration, automation and deployment processes.

Database and cloud technologies.

Managing, optimizing and querying large amounts of data in an Exasol database (prospectively Snowflake).

Google Cloud Platform (GCP): Operation and scaling of cloud-based BI solutions, in particular:

Composer (Airflow): Orchestration of data pipelines for ETL processes.

Cloud Functions: Development of serverless functions for data processing and automation.

Cloud Scheduler: Planning and automation of recurring cloud jobs.

Cloud Secret Manager: Secure storage and management of sensitive access data and API keys.

BigQuery: Processing, analyzing and querying large amounts of data in the cloud.

Cloud Storage: Storage and management of structured and unstructured data.

Cloud monitoring: monitoring the performance and stability of cloud-based applications.

Data visualization and reporting.

Creation of interactive dashboards and reports for the analysis and visualization of business data with Power BI.
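
The Python-plus-SQL pipeline work described above can be sketched in a few lines. The example below uses Python's built-in sqlite3 module purely as a stand-in for the Exasol/BigQuery backends named in the listing; the table and column names are invented for illustration:

```python
import sqlite3

def run_pipeline(rows):
    """Toy extract-transform-load step: load raw order rows,
    aggregate revenue per customer with SQL, return the result."""
    con = sqlite3.connect(":memory:")  # stand-in for Exasol/BigQuery
    con.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
    con.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    cur = con.execute(
        "SELECT customer, SUM(amount) AS revenue "
        "FROM orders GROUP BY customer ORDER BY revenue DESC"
    )
    return cur.fetchall()

if __name__ == "__main__":
    sample = [("acme", 120.0), ("beta", 80.0), ("acme", 30.0)]
    print(run_pipeline(sample))  # [('acme', 150.0), ('beta', 80.0)]
```

The extract-load-aggregate shape stays the same whether the target is an in-memory database or a cloud warehouse; only the client library and connection details change.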


Requirements:

Minimum of 4-6 years of experience in backend development, with strong expertise in BigQuery, Python and MongoDB or SQL.

Strong knowledge of database design, querying, and optimization with SQL and MongoDB and designing ETL and orchestration of data pipelines.

Experience of a minimum of 2 years with at least one hyperscaler, ideally GCP.

Experience with cloud storage technologies, cloud monitoring and cloud secret management.

Excellent communication skills to effectively collaborate with team members and stakeholders.


Nice-to-Have:

 

Knowledge of agile methodologies and working in cross-functional, collaborative teams.

Skills & Requirements

SQL, BigQuery, GCP, Python, MongoDB, Exasol, Snowflake, Bash scripting, Airflow, Cloud Functions, Cloud Scheduler, Cloud Secret Manager, Cloud Storage, Cloud Monitoring, ETL, Data Pipelines, Power BI, Database Optimization, Cloud-Based BI Solutions, Data Processing, Data Automation, Agile Methodologies, Cross-Functional Collaboration.

Read more
Adesso India
Remote only
4 - 15 yrs
₹10L - ₹27L / yr
skill iconPython
Google Cloud Platform (GCP)
SQL
skill iconMongoDB
skill iconJava

Immediate Joiners Preferred. Notice Period - Immediate to 30 Days


Interested candidates are requested to email their resumes with the subject line "Application for [Job Title]".

Only applications received via email will be reviewed. Applications through other channels will not be considered.


About Us

adesso India is a dynamic and innovative IT Services and Consulting company based in Kochi. We are committed to delivering cutting-edge solutions that make a meaningful impact on our clients. As we continue to expand our development team, we are seeking a talented and motivated Backend Developer to join us in creating scalable and high-performance backend systems.


Job Description

We are looking for an experienced Backend and Data Developer with expertise in Java, SQL, and BigQuery development on public clouds, mainly GCP. As a Senior Data Developer, you will play a vital role in designing, building, and maintaining robust systems that support our data analytics. This position offers the opportunity to work on complex services, collaborating closely with cross-functional teams to drive successful project delivery.


Responsibilities

  • Development and maintenance of data pipelines and automation scripts with Python
  • Creation of data queries and optimization of database processes with SQL
  • Use of bash scripts for system administration, automation and deployment processes
  • Database and cloud technologies
  • Managing, optimizing and querying large amounts of data in an Exasol database (prospectively Snowflake)
  • Google Cloud Platform (GCP): Operation and scaling of cloud-based BI solutions, in particular:
  • Composer (Airflow): Orchestration of data pipelines for ETL processes
  • Cloud Functions: Development of serverless functions for data processing and automation
  • Cloud Scheduler: Planning and automation of recurring cloud jobs
  • Cloud Secret Manager: Secure storage and management of sensitive access data and API keys
  • BigQuery: Processing, analyzing and querying large amounts of data in the cloud
  • Cloud Storage: Storage and management of structured and unstructured data
  • Cloud monitoring: monitoring the performance and stability of cloud-based applications
  • Data visualization and reporting
  • Creation of interactive dashboards and reports for the analysis and visualization of business data with Power BI


Requirements

  • Minimum of 4-6 years of experience in backend development, with strong expertise in BigQuery, Python and MongoDB or SQL.
  • Strong knowledge of database design, querying, and optimization with SQL and MongoDB and designing ETL and orchestration of data pipelines.
  • Experience of a minimum of 2 years with at least one hyperscaler, ideally GCP
  • Experience with cloud storage technologies, cloud monitoring and cloud secret management
  • Excellent communication skills to effectively collaborate with team members and stakeholders.

Nice-to-Have:

  • Knowledge of agile methodologies and working in cross-functional, collaborative teams.
Read more
Adesso India

Agency job
via HashRoot by Deepak S
Remote only
3 - 11 yrs
₹6L - ₹27L / yr
Data engineering
Data architecture
skill iconAmazon Web Services (AWS)
Windows Azure
Data Transformation Tool (DBT)
+3 more

Overview

adesso India specialises in optimization of core business processes for organizations. Our focus is on providing state-of-the-art solutions that streamline operations and elevate productivity to new heights.

Comprised of a team of industry experts and experienced technology professionals, we ensure that our software development and implementations are reliable, robust, and seamlessly integrated with the latest technologies. By leveraging our extensive knowledge and skills, we empower businesses to achieve their objectives efficiently and effectively.


Job Description

We are seeking a skilled Cloud Data Engineer with experience in cloud data platforms such as AWS or Azure, and especially Snowflake and dbt, to join our dynamic team. As a consultant, you will be responsible for developing new data platforms and creating the data processes around them. You will collaborate with cross-functional teams to design, develop, and deploy high-quality data solutions.


Responsibilities:

Customer consulting: You develop data-driven products in the Snowflake Cloud and connect data & analytics with specialist departments. You develop ELT processes using dbt (data build tool).

Specifying requirements: You develop concrete requirements for future-proof cloud data architectures.

Develop data routes: You design scalable and powerful data management processes.

Analyze data: You derive sound findings from data sets and present them in an understandable way.
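
The ELT responsibility above can be made concrete: a dbt model is, at its core, a SELECT statement that the tool materializes as a table or view. The hand-rolled sketch below mimics that pattern, with sqlite3 standing in for Snowflake; all object names (stg_orders, mart_daily_revenue) are invented for illustration:

```python
import sqlite3

# A dbt model is essentially a SELECT statement that the tool
# materializes as a table or view; this mimics that pattern by hand.
MODEL_SQL = """
CREATE TABLE mart_daily_revenue AS
SELECT order_date, SUM(amount) AS revenue
FROM stg_orders
GROUP BY order_date
"""

def build_model(con):
    """Materialize the model, then return its rows for inspection."""
    con.executescript(MODEL_SQL)
    return con.execute(
        "SELECT order_date, revenue FROM mart_daily_revenue ORDER BY order_date"
    ).fetchall()

if __name__ == "__main__":
    con = sqlite3.connect(":memory:")  # stand-in for Snowflake
    con.execute("CREATE TABLE stg_orders (order_date TEXT, amount REAL)")
    con.executemany(
        "INSERT INTO stg_orders VALUES (?, ?)",
        [("2024-01-01", 10.0), ("2024-01-01", 5.0), ("2024-01-02", 7.5)],
    )
    print(build_model(con))  # [('2024-01-01', 15.0), ('2024-01-02', 7.5)]
```

In a real dbt project the CREATE TABLE boilerplate, dependency ordering and scheduling (for example via Composer/Airflow) are handled by the tooling; only the SELECT logic is written by hand.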


Requirements:

Requirements management and project experience: You successfully implement cloud-based data & analytics projects.

Data architectures: You are proficient in DWH/data lake concepts and modeling with Data Vault 2.0.

Cloud expertise: You have extensive knowledge of Snowflake, dbt and other cloud technologies (e.g. MS Azure, AWS, GCP).

SQL know-how: You have a sound and solid knowledge of SQL.

Data management: You are familiar with topics such as master data management and data quality.

Bachelor's degree in computer science or a related field.

Strong communication and collaboration abilities to work effectively in a team environment.

 

Skills & Requirements

Cloud Data Engineering, AWS, Azure, Snowflake, dbt, ELT processes, Data-driven consulting, Cloud data architectures, Scalable data management, Data analysis, Requirements management, Data warehousing, Data lake, Data Vault 2.0, SQL, Master data management, Data quality, GCP, Strong communication, Collaboration.

Read more
Adesso India

Agency job
via HashRoot by Deepak S
Remote only
5 - 12 yrs
₹10L - ₹25L / yr
J2EE
JPA
EJB
JAAS
SAML
+7 more

Overview

Adesso India specialises in optimization of core business processes for organizations. Our focus is on providing state-of-the-art solutions that streamline operations and elevate productivity to new heights.

Comprised of a team of industry experts and experienced technology professionals, we ensure that our software development and implementations are reliable, robust, and seamlessly integrated with the latest technologies. By leveraging our extensive knowledge and skills, we empower businesses to achieve their objectives efficiently and effectively.


Job Description

The current application landscape features multiple Java web services running on JEE application servers, primarily hosted on AWS, and integrated with various systems such as SAP, other services, and external partners. DPS is committed to delivering the best digital work experience for the customer's employees and customers alike.


Responsibilities:

Independent front- and backend implementation of business functionalities, defined as user stories by the customer, considering the cost-value ratio and maintenance effort for the customer.

Implementation of user stories and incidents, including concept, implementation (including automated unit tests), and communication with the customer within the agile development process.

Database activities such as creation or modification of database schema, as well as implementation of database access, queries, and data modification.

Interface realization based on standard principles like REST or SOAP.

Implementation of given Identity and Access Management patterns for securing the application.

Analysis and resolution of issues (3rd Level Support).

Documentation of the implementation.

Consultancy in technical and business topics within the applications.

Usage of selected tools for implementation, testing, rollout, and support.

Participation in regular meetings with the client to track the status of assigned tasks.
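
The stack in this listing is Java/JEE, but the REST-over-JSON principle named above is language-agnostic: the URL path identifies a resource, and the response carries a JSON representation of it. As an illustration only, the sketch below serves such an endpoint with Python's standard library; the /users resource and its fields are invented:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Toy resource store; a real service would back this with the database.
USERS = {1: {"id": 1, "name": "Alice"}}

class UserHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # REST principle: the URL path identifies the resource (/users/<id>).
        try:
            user = USERS[int(self.path.rstrip("/").rsplit("/", 1)[-1])]
        except (KeyError, ValueError):
            self.send_response(404)
            self.end_headers()
            return
        body = json.dumps(user).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

if __name__ == "__main__":
    server = HTTPServer(("127.0.0.1", 0), UserHandler)  # port 0: pick a free port
    threading.Thread(target=server.serve_forever, daemon=True).start()
    url = f"http://127.0.0.1:{server.server_port}/users/1"
    with urllib.request.urlopen(url) as resp:
        print(json.loads(resp.read()))  # {'id': 1, 'name': 'Alice'}
    server.shutdown()
```

In the JEE world the same contract would typically be realized with JAX-RS annotations rather than a hand-written handler; the on-the-wire behaviour (path-addressed resource, JSON body, status codes) is identical.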


Requirements:

Experience with JEE technologies such as JPA, EJB, CDI, JAAS, and SAML.

Experience with JEE-related tooling, such as Maven.

Proficiency in HTML5, CSS, Angular, and Bootstrap.

Strong knowledge of SQL.

Experience with web services (SOAP, REST, JSON).


Skills & Requirements

JEE, JPA, EJB, CDI, JAAS, SAML, Maven, HTML5, CSS, Angular, Bootstrap, SQL, SOAP, REST, JSON, Database schema design, Unit testing, Agile development, Identity and Access Management (IAM), Troubleshooting, Documentation, Third-level support.

Read more
Adesso India
Remote only
5 - 20 yrs
₹10L - ₹25L / yr
skill iconJava
skill iconAngular (2+)
JPA
EJB
JAAS
+9 more

Interested candidates are requested to email their resumes with the subject line "Application for [Job Title]".

Only applications received via email will be reviewed. Applications through other channels will not be considered.


Overview

Adesso India specialises in optimization of core business processes for organizations. Our focus is on providing state-of-the-art solutions that streamline operations and elevate productivity to new heights.

Comprised of a team of industry experts and experienced technology professionals, we ensure that our software development and implementations are reliable, robust, and seamlessly integrated with the latest technologies. By leveraging our extensive knowledge and skills, we empower businesses to achieve their objectives efficiently and effectively.


Job Description

The current application landscape features multiple Java web services running on JEE application servers, primarily hosted on AWS, and integrated with various systems such as SAP, other services, and external partners. DPS is committed to delivering the best digital work experience for the customer's employees and customers alike.


Responsibilities:

Independent front- and backend implementation of business functionalities, defined as user stories by the customer, considering the cost-value ratio and maintenance effort for the customer.

Implementation of user stories and incidents, including concept, implementation (including automated unit tests), and communication with the customer within the agile development process.

Database activities such as creation or modification of database schema, as well as implementation of database access, queries, and data modification.

Interface realization based on standard principles like REST or SOAP.

Implementation of given Identity and Access Management patterns for securing the application.

Analysis and resolution of issues (3rd Level Support).

Documentation of the implementation.

Consultancy in technical and business topics within the applications.

Usage of selected tools for implementation, testing, rollout, and support.

Participation in regular meetings with the client to track the status of assigned tasks.


Requirements:

Experience with JEE technologies such as JPA, EJB, CDI, JAAS, and SAML.

Experience with JEE-related tooling, such as Maven.

Proficiency in HTML5, CSS, Angular, and Bootstrap.

Strong knowledge of SQL.

Experience with web services (SOAP, REST, JSON).


Skills & Requirements

JEE, JPA, EJB, CDI, JAAS, SAML, Maven, HTML5, CSS, Angular, Bootstrap, SQL, SOAP, REST, JSON, Database schema design, Unit testing, Agile development, Identity and Access Management (IAM), Troubleshooting, Documentation, Third-level support.



Read more