
50+ Remote SQL Jobs in India

Apply to 50+ Remote SQL Jobs on CutShort.io. Find your next job, effortlessly. Browse SQL Jobs and apply today!

FloBiz
Agency job
via AccioJob by AccioJob Hiring Board
Remote only
0 - 0 yrs
₹12L - ₹15L / yr
SQL
RESTful APIs
Object Oriented Programming (OOPs)
DSA

AccioJob is conducting a Walk-In Hiring Drive with FloBiz for the position of Backend Intern.


To apply, register and select your slot here: https://go.acciojob.com/dkfKBz


Required Skills: SQL, REST APIs, OOPs, DSA


Eligibility:

  • Degree: B.Tech/BE, BCA, B.Sc.
  • Branch: Computer Science/CSE/Other CS related branch, IT
  • Graduation Year: 2025, 2026


Work Details:

  • Work Location: Remote
  • CTC: ₹12 LPA to ₹15 LPA


Evaluation Process:

Round 1: Offline Assessment at AccioJob Skill Centres in Noida, Pune, Chennai, Hyderabad, and Bangalore


Further Rounds (for shortlisted candidates only):

Profile & Background Screening Round, Technical Interview Round 1, Technical Interview Round 2, Cultural Fit Round

Important Note: Bring your laptop & earphones for the test.


Register here: https://go.acciojob.com/dkfKBz

Or apply in seconds — straight from our brand-new app!

https://go.acciojob.com/L6rH7C


Fountane Inc
HR Fountane
Posted by HR Fountane
Remote only
5 - 9 yrs
₹18L - ₹32L / yr
Amazon Web Services (AWS)
AWS Lambda
AWS CloudFormation
ETL
Docker
+3 more

Position Overview: We are looking for an experienced and highly skilled Senior Data Engineer to join our team and help design, implement, and optimize data systems that support high-end analytical solutions for our clients. As a customer-centric Data Engineer, you will work closely with clients to understand their business needs and translate them into robust, scalable, and efficient technical solutions. You will be responsible for end-to-end data modelling, integration workflows, and data transformation processes while ensuring security, privacy, and compliance.

In this role, you will also leverage the latest advancements in artificial intelligence, machine learning, and large language models (LLMs) to deliver high-impact solutions that drive business success. The ideal candidate will have a deep understanding of data infrastructure, optimization techniques, and cost-effective data management.


Key Responsibilities:


• Customer Collaboration:

– Partner with clients to gather and understand their business requirements, translating them into actionable technical specifications.

– Act as the primary technical consultant to guide clients through data challenges and deliver tailored solutions that drive value.


• Data Modeling & Integration:

– Design and implement scalable, efficient, and optimized data models to support business operations and analytical needs.

– Develop and maintain data integration workflows to seamlessly extract, transform, and load (ETL) data from various sources into data repositories.

– Ensure smooth integration between multiple data sources and platforms, including cloud and on-premise systems.


• Data Processing & Optimization:

– Develop, optimize, and manage data processing pipelines to enable real-time and batch data processing at scale.

– Continuously evaluate and improve data processing performance, optimizing for throughput while minimizing infrastructure costs.


• Data Governance & Security:

– Implement and enforce data governance policies and best practices, ensuring data security, privacy, and compliance with relevant industry regulations (e.g., GDPR, HIPAA).

– Collaborate with security teams to safeguard sensitive data and maintain privacy controls across data environments.


• Cross-Functional Collaboration:

– Work closely with data engineers, data scientists, and business analysts to ensure that the data architecture aligns with organizational objectives and delivers actionable insights.

– Foster collaboration across teams to streamline data workflows and optimize solution delivery.


• Leveraging Advanced Technologies:

– Utilize AI, machine learning models, and large language models (LLMs) to automate processes, accelerate delivery, and provide smart, data-driven solutions to business challenges.

– Identify opportunities to apply cutting-edge technologies to improve the efficiency, speed, and quality of data processing and analytics.


• Cost Optimization:

– Proactively manage infrastructure and cloud resources to optimize throughput while minimizing operational costs.

– Make data-driven recommendations to reduce infrastructure overhead and increase efficiency without sacrificing performance.


Qualifications:


• Experience:

– Proven experience (5+ years) as a Data Engineer or similar role, designing and implementing data solutions at scale.

– Strong expertise in data modelling, data integration (ETL), and data transformation processes.

– Experience with cloud platforms (AWS, Azure, Google Cloud) and big data technologies (e.g., Hadoop, Spark).


• Technical Skills:

– Advanced proficiency in SQL, data modelling tools (e.g., Erwin, PowerDesigner), and data integration frameworks (e.g., Apache NiFi, Talend).

– Strong understanding of data security protocols, privacy regulations, and compliance requirements.

– Experience with data storage solutions (e.g., data lakes, data warehouses, NoSQL, relational databases).


• AI & Machine Learning Exposure:

– Familiarity with leveraging AI and machine learning technologies (e.g., TensorFlow, PyTorch, scikit-learn) to optimize data processing and analytical tasks.

– Ability to apply advanced algorithms and automation techniques to improve business processes.


• Soft Skills:

– Excellent communication skills to collaborate with clients, stakeholders, and cross-functional teams.

– Strong problem-solving ability with a customer-centric approach to solution design.

– Ability to translate complex technical concepts into clear, understandable terms for non-technical audiences.


• Education:

– Bachelor’s or Master’s degree in Computer Science, Information Systems, Data Science, or a related field (or equivalent practical experience).


LIFE AT FOUNTANE:

  • Fountane offers an environment where all members are supported, challenged, recognized & given opportunities to grow to their fullest potential.
  • Competitive pay
  • Health insurance for spouses, kids, and parents.
  • PF/ESI or equivalent
  • Individual/team bonuses
  • Employee stock ownership plan
  • Fun/challenging variety of projects/industries
  • Flexible workplace policy - remote/physical
  • Flat organization - no micromanagement
  • Individual contribution - set your deadlines
  • Above all - a culture that helps you grow exponentially!


A LITTLE BIT ABOUT THE COMPANY:

Established in 2017, Fountane Inc is a Ventures Lab incubating and investing in new competitive technology businesses from scratch. Thus far, we’ve created half a dozen multi-million valuation companies in the US and a handful of sister ventures for large corporations, including Target, US Ventures, and Imprint Engine.

We’re a team of 120+ strong from around the world who are radically open-minded, believe in excellence and respect for one another, and push our boundaries further than ever before.

KGISL MICROCOLLEGE
Agency job
via EDU TECH by Srimathi Balamurugan
Remote, Kochi (Cochin)
1 - 5 yrs
₹2L - ₹6L / yr
Business Analysis
SQL
MS-Excel
Tableau
PowerBI

We are looking for a passionate and experienced Business Analyst Trainer to join our training team. This role involves delivering high-quality training programs on business analysis tools, methodologies, and best practices, both in-person and online.

Hypersonix Inc

Reshika Mendiratta
Posted by Reshika Mendiratta
Remote only
7yrs+
Up to ₹30L / yr (varies)
Data Analytics
SQL
MS-Excel
Python
R Programming
+5 more

About the Role:

We are looking for a Senior Technical Customer Success Manager to join our growing team. This is a client-facing role focused on ensuring successful adoption and value realization of our SaaS solutions. The ideal candidate will come from a strong analytics background, possess hands-on skills in SQL and Python or R, and have experience working with dashboarding tools. Prior experience in eCommerce or retail domains is a strong plus.


Responsibilities:

  • Own post-sale customer relationship and act as the primary technical point of contact.
  • Drive product adoption and usage through effective onboarding, training, and ongoing support.
  • Work closely with clients to understand business goals and align them with product capabilities.
  • Collaborate with internal product, engineering, and data teams to deliver solutions and enhancements tailored to client needs.
  • Analyze customer data and usage trends to proactively identify opportunities and risks.
  • Build dashboards or reports for customers using internal tools or integrations.
  • Lead business reviews, share insights, and communicate value delivered.
  • Support customers in configuring rules, data integrations, and troubleshooting issues.
  • Drive renewal and expansion by ensuring customer satisfaction and delivering measurable outcomes.


Requirements:

  • 7+ years of experience in a Customer Success, Technical Account Management, or Solution Consulting role in a SaaS or software product company.
  • Strong SQL skills and working experience with Python or R.
  • Experience with dashboarding tools such as Tableau, Power BI, Looker, or similar.
  • Understanding of data pipelines, APIs, and data modeling.
  • Excellent communication and stakeholder management skills.
  • Proven track record of managing mid to large enterprise clients.
  • Experience in eCommerce, retail, or consumer-facing businesses is highly desirable.
  • Ability to translate technical details into business context and vice versa.
  • Bachelor’s or Master’s degree in Computer Science, Analytics, Engineering, or related field.


Nice to Have:

  • Exposure to machine learning workflows, recommendation systems, or pricing analytics.
  • Familiarity with cloud platforms (AWS/GCP/Azure).
  • Experience working with cross-functional teams in Agile environments.
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Remote only
10 - 15 yrs
₹10L - ₹18L / yr
Solution architecture
Denodo
Data Virtualization
Data architecture
SQL
+5 more

Job Title : Solution Architect – Denodo

Experience : 10+ Years

Location : Remote / Work from Home

Notice Period : Immediate joiners preferred


Job Overview :

We are looking for an experienced Solution Architect – Denodo to lead the design and implementation of data virtualization solutions. In this role, you will work closely with cross-functional teams to ensure our data architecture aligns with strategic business goals. The ideal candidate will bring deep expertise in Denodo, strong technical leadership, and a passion for driving data-driven decisions.


Mandatory Skills : Denodo, Data Virtualization, Data Architecture, SQL, Data Modeling, ETL, Data Integration, Performance Optimization, Communication Skills.


Key Responsibilities :

  • Architect and design scalable data virtualization solutions using Denodo.
  • Collaborate with business analysts and engineering teams to understand requirements and define technical specifications.
  • Ensure adherence to best practices in data governance, performance, and security.
  • Integrate Denodo with diverse data sources and optimize system performance.
  • Mentor and train team members on Denodo platform capabilities.
  • Lead tool evaluations and recommend suitable data integration technologies.
  • Stay updated with emerging trends in data virtualization and integration.

Required Qualifications :

  • Bachelor’s degree in Computer Science, IT, or a related field.
  • 10+ Years of experience in data architecture and integration.
  • Proven expertise in Denodo and data virtualization frameworks.
  • Strong proficiency in SQL and data modeling.
  • Hands-on experience with ETL processes and data integration tools.
  • Excellent communication, presentation, and stakeholder management skills.
  • Ability to lead technical discussions and influence architectural decisions.
  • Denodo or data architecture certifications are a strong plus.
Remote only
4 - 6 yrs
₹10L - ₹15L / yr
Angular (2+)
.NET
SQL
Relational Database (RDBMS)
Dependency injection

.NET + Angular Full Stack Developer (4–5 Years Experience)

Location: Pune/Remote

Experience Required: 4 to 5 years

Communication: Fluent English (verbal & written)

Technology: .NET, Angular

Only immediate joiners who can start on 21st July should apply.


Job Overview

We are seeking a skilled and experienced Full Stack Developer with strong expertise in .NET (C#) and Angular to join our dynamic team in Pune. The ideal candidate will have hands-on experience across the full development stack, a strong understanding of relational databases and SQL, and the ability to work independently with clients. Experience in microservices architecture is a plus.


Key Responsibilities

  • Design, develop, and maintain modern web applications using .NET Core / .NET Framework and Angular
  • Write clean, scalable, and maintainable code for both backend and frontend components
  • Interact directly with clients for requirement gathering, demos, sprint planning, and issue resolution
  • Work closely with designers, QA, and other developers to ensure high-quality product delivery
  • Perform regular code reviews, ensure adherence to coding standards, and mentor junior developers if needed
  • Troubleshoot and debug application issues and provide timely solutions
  • Participate in discussions on architecture, design patterns, and technical best practices

Must-Have Skills

✅ Strong hands-on experience with .NET Core / .NET Framework (Web API, MVC)

✅ Proficiency in Angular (Component-based architecture, RxJS, State Management)

✅ Solid understanding of RDBMS and SQL (preferably with SQL Server)

✅ Familiarity with Entity Framework or Dapper

✅ Strong knowledge of RESTful API design and integration

✅ Version control using Git

✅ Excellent verbal and written communication skills

✅ Ability to work in a client-facing role and handle discussions independently

Good-to-Have / Optional Skills

Understanding or experience in Microservices Architecture

Exposure to CI/CD pipelines, unit testing frameworks, and cloud environments (e.g., Azure or AWS)

Wissen Technology

at Wissen Technology

4 recruiters
Shrutika SaileshKumar
Posted by Shrutika SaileshKumar
Remote, Bengaluru (Bangalore)
5 - 9 yrs
Best in industry
Python
SDET
BDD
SQL
Data Warehouse (DWH)
+2 more

Primary skill set: QA Automation, Python, BDD, SQL 

As Senior Data Quality Engineer you will:

  • Evaluate product functionality and create test strategies and test cases to assess product quality.
  • Work closely with the on-shore and offshore teams.
  • Validate multiple reports against the databases by running medium to complex SQL queries.
  • Develop a solid understanding of automation objects and integrations across various platforms and applications.
  • Work as an individual contributor, exploring opportunities to improve performance and articulating the importance and advantages of proposed improvements to management.
  • Integrate with SCM infrastructure to establish a continuous build and test cycle using CI/CD tools.
  • Work comfortably in Linux/Windows environments and hybrid infrastructure models hosted on cloud platforms.
  • Establish processes and a tool set to maintain automation scripts and generate regular test reports.
  • Conduct peer reviews to provide feedback and ensure the test scripts are flawless.

Core/Must have skills:

  • Excellent understanding of and hands-on experience in ETL/DWH testing, preferably with Databricks, paired with Python experience.
  • Hands-on experience with SQL (analytical functions and complex queries), along with knowledge of using SQL client utilities effectively.
  • Clear and crisp communication and commitment towards deliverables.
  • Experience in Big Data testing is an added advantage.
  • Knowledge of Spark and Scala, Hive/Impala, and Python is an added advantage.
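To illustrate the report-validation work this role describes, here is a minimal sketch using Python's built-in sqlite3 module with an analytical (window) SQL query; the sales table, its columns, and the expected totals are hypothetical, not part of the posting:

```python
import sqlite3

# Minimal sketch of validating report totals against the source database,
# using a window (analytical) function. The "sales" table and the expected
# totals are invented for this example.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES ('North', 100), ('North', 200), ('South', 50);
""")

# Analytical query: attach the regional total to every row.
rows = conn.execute("""
    SELECT region, amount,
           SUM(amount) OVER (PARTITION BY region) AS region_total
    FROM sales
    ORDER BY region, amount
""").fetchall()

# Cross-check the report's regional totals against the raw rows.
expected = {"North": 300.0, "South": 50.0}
for region, amount, region_total in rows:
    assert region_total == expected[region], f"Mismatch in {region}"
print("report totals validated")
```

In practice the expected figures would come from the report under test rather than a hard-coded dictionary, and the queries would run against the product's actual databases.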

Good to have skills:

  • Test automation using BDD/Cucumber or TestNG, combined with strong hands-on experience in Java with Selenium; working experience with WebdriverIO is especially valuable.
  • Ability to effectively articulate technical challenges and solutions.
  • Work experience with qTest, Jira, and WebdriverIO.


Hypersonix Inc

Reshika Mendiratta
Posted by Reshika Mendiratta
Remote only
7yrs+
Up to ₹40L / yr (varies)
SQL
Python
ETL
Data engineering
Big Data
+2 more

About the Company

Hypersonix.ai is disrupting the e-commerce space with AI, ML and advanced decision capabilities to drive real-time business insights. Hypersonix.ai has been built ground up with new age technology to simplify the consumption of data for our customers in various industry verticals. Hypersonix.ai is seeking a well-rounded, hands-on product leader to help lead product management of key capabilities and features.


About the Role

We are looking for talented and driven Data Engineers at various levels to work with customers to build the data warehouse, analytical dashboards and ML capabilities as per customer needs.


Roles and Responsibilities

  • Create and maintain optimal data pipeline architecture
  • Assemble large, complex data sets that meet functional / non-functional business requirements; should write complex queries in an optimized way
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies
  • Run ad-hoc analysis utilizing the data pipeline to provide actionable insights
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs
  • Keep our data separated and secure across national boundaries through multiple data centers and AWS regions
  • Work with analytics and data scientist team members and assist them in building and optimizing our product into an innovative industry leader


Requirements

  • Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases
  • Experience building and optimizing ‘big data’ data pipelines, architectures and data sets
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
  • Strong analytic skills related to working with unstructured datasets
  • Build processes supporting data transformation, data structures, metadata, dependency and workload management
  • A successful history of manipulating, processing and extracting value from large disconnected datasets
  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores
  • Experience supporting and working with cross-functional teams in a dynamic environment
  • We are looking for a candidate with 7+ years of experience in a Data Engineer role who holds a graduate degree in Computer Science or Information Technology, or has completed an MCA.
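The extract-transform-load pattern referenced throughout this role can be sketched in a few lines; this is an illustrative example only, with invented records and a hypothetical events table:

```python
import sqlite3

# Illustrative extract-transform-load (ETL) sketch; the records and the
# target "events" table are invented for this example.

# Extract: pretend these rows came from an upstream source (API, file dump, etc.).
raw_events = [
    {"username": "a", "amount": "10.5"},
    {"username": "b", "amount": "not-a-number"},  # dirty record
    {"username": "a", "amount": "4.5"},
]

# Transform: coerce types and drop records that fail validation.
def transform(rows):
    clean = []
    for r in rows:
        try:
            clean.append((r["username"], float(r["amount"])))
        except ValueError:
            continue  # in production, route rejects to a dead-letter store
    return clean

# Load: write the clean rows into the warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (username TEXT, amount REAL)")
conn.executemany("INSERT INTO events VALUES (?, ?)", transform(raw_events))

total = conn.execute("SELECT SUM(amount) FROM events").fetchone()[0]
print(total)  # → 15.0
```

At the scale this posting describes, the same three stages would be built on AWS "big data" services and orchestrated pipelines rather than an in-memory database, but the shape of the work is the same.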
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Remote only
5 - 10 yrs
₹10L - ₹22L / yr
Business Analysis
Healthcare
Requirements management
User stories
Gap analysis
+11 more

Position : Business Analyst

Experience : 5+ Years

Location : Remote

Notice Period : Immediate Joiners Preferred (or candidates serving 10–15 days’ notice)

Interview Mode : Virtual


Job Description :

We are seeking an experienced Business Analyst with a strong background in requirements gathering, functional documentation, and stakeholder management, particularly in the US Healthcare payer domain.


Mandatory Skills :

Business Analysis, US Healthcare Payer Domain, Requirement Gathering, User Stories, Gap & Impact Analysis, Azure DevOps/TFS, SQL, UML Modeling, SDLC/STLC, System Testing, UAT, Strong Communication Skills.


Key Responsibilities :

  • Analyze and understand complex business and functional requirements.
  • Translate business needs into detailed User Stories, functional and technical specifications.
  • Conduct gap analysis and impact assessment for new and existing product features.
  • Create detailed documentation including scope, project plans, and secure stakeholder approvals.
  • Support System Testing and User Acceptance Testing (UAT) from a functional perspective.
  • Prepare and maintain release notes, end-user documentation, training materials, and process flows.
  • Serve as a liaison between business and technical teams, ensuring cross-functional alignment.
  • Assist with sprint planning, user story tracking, and status updates using Azure DevOps / TFS.
  • Write and execute basic SQL queries for data validation and analysis.
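The kind of basic data-validation SQL listed above can be sketched as follows; the claims table and its columns are hypothetical placeholders chosen to echo the payer domain, not part of the posting:

```python
import sqlite3

# Sketch of basic data-validation SQL; the "claims" table and its columns
# are hypothetical examples from a payer-domain context.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE claims (claim_id INTEGER PRIMARY KEY, member_id TEXT, status TEXT);
    INSERT INTO claims (member_id, status) VALUES
        ('M1', 'PAID'), ('M2', 'DENIED'), (NULL, 'PAID');
""")

# Check 1: no claim should be missing a member_id.
orphans = conn.execute(
    "SELECT COUNT(*) FROM claims WHERE member_id IS NULL"
).fetchone()[0]

# Check 2: every status must come from the allowed set.
bad_status = conn.execute(
    "SELECT COUNT(*) FROM claims WHERE status NOT IN ('PAID', 'DENIED', 'PENDING')"
).fetchone()[0]

print(orphans, bad_status)  # → 1 0
```

Null checks, row counts, and allowed-value checks like these are usually enough to confirm whether the data behind a user story behaves as specified.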

Required Skills :

  • Minimum 5 years of experience as a Business Analyst.
  • Strong analytical, problem-solving, and communication skills.
  • Solid understanding of Project Life Cycle, STLC, and UML modeling.
  • Prior experience in US Healthcare payer domain is mandatory.
  • Familiarity with tools like Azure DevOps / TFS.
  • Ability to work with urgency, manage priorities, and maintain attention to detail.
  • Strong team collaboration and stakeholder management.
GroundTruth
Laxmi Pal
Posted by Laxmi Pal
Remote only
4 - 6 yrs
₹14L - ₹16L / yr
Python
SQL
Data Analytics
Data Structures
Amazon Web Services (AWS)
+4 more

Role Characteristics:

The Analytics team provides analytical support to multiple stakeholders (Product, Engineering, Business Development, Ad Operations) by developing scalable analytical solutions, identifying problems, and defining and monitoring KPIs to measure the impact of product improvements and streamline processes. This is an exciting and challenging role that will enable you to work with large data sets, expose you to cutting-edge analytical techniques, and let you work with the latest AWS analytics infrastructure (Redshift, S3, Athena) while gaining experience in the use of location data to drive businesses. Working in a dynamic start-up environment will give you significant opportunities for growth within the organization. A successful applicant will be passionate about technology and about developing a deep understanding of human behavior in the real world. They will also have excellent communication skills, be able to synthesize and present complex information, and be a fast learner.


You Will:

  • Perform root cause analysis with minimal guidance to determine the reasons for sudden changes or abnormalities in metrics
  • Understand the objective/business context of various tasks and seek clarity by collaborating with different stakeholders (like Product and Engineering)
  • Derive insights and put them together to build a story that solves a given problem
  • Suggest process improvements such as script optimization and automation of repetitive tasks
  • Create and automate reports and dashboards through Python to track given metrics per requirements
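The report-automation task above can be sketched in pure Python; the metric name and the numbers below are invented for illustration:

```python
import csv
import io
import statistics

# Sketch of automating a small metrics report in pure Python; the metric
# and the data are invented for this example.
raw = io.StringIO("""\
day,impressions
mon,1000
tue,1200
wed,800
""")

rows = list(csv.DictReader(raw))
values = [int(r["impressions"]) for r in rows]

# A scheduled job (e.g., via cron) could write this report to a file or
# push it to a dashboard; here we just assemble and print it.
report = {
    "days": len(values),
    "total": sum(values),
    "mean": statistics.mean(values),
}
print(report)  # → {'days': 3, 'total': 3000, 'mean': 1000}
```

In the role itself the input would come from Redshift or Athena rather than an inline CSV, and the output would feed a Looker/Tableau/QuickSight dashboard.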


Technical Skills (Must have)

  • B.Tech degree in Computer Science, Statistics, Mathematics, Economics, or related fields
  • 4-6 years of experience working with data and conducting statistical and/or numerical analysis
  • Ability to write SQL code
  • Scripting/automation using Python
  • Hands-on experience with a data visualization tool like Looker/Tableau/QuickSight
  • Basic to advanced understanding of statistics


Other Skills (Must have)

  • Be willing and able to quickly learn about new businesses, database technologies, and analysis techniques
  • Strong oral and written communication
  • Understand patterns/trends and draw insights from them


Preferred Qualifications (Nice to have)

  • Experience working with large datasets
  • Experience with AWS analytics infrastructure (Redshift, S3, Athena, Boto3)
  • Hands-on experience with AWS services like Lambda, Step Functions, Glue, and EMR, plus exposure to PySpark


What we offer

At GroundTruth, we want our employees to be comfortable with their benefits so they can focus on doing the work they love.

  • Parental leave- Maternity and Paternity
  • Flexible Time Offs (Earned Leaves, Sick Leaves, Birthday leave, Bereavement leave & Company Holidays)
  • In Office Daily Catered Lunch
  • Fully stocked snacks/beverages
  • Health cover for any hospitalization. Covers both nuclear family and parents
  • Tele-med for free doctor consultation, discounts on health checkups and medicines
  • Wellness/Gym Reimbursement
  • Pet Expense Reimbursement
  • Childcare Expenses and reimbursements
  • Employee assistance program
  • Employee referral program
  • Education reimbursement program
  • Skill development program
  • Cell phone reimbursement (Mobile Subsidy program)
  • Internet reimbursement
  • Birthday treat reimbursement
  • Employee Provident Fund Scheme offering different tax saving options such as VPF and employee and employer contribution up to 12% Basic
  • Creche reimbursement
  • Co-working space reimbursement
  • NPS employer match
  • Meal card for tax benefit
  • Special benefits on salary account


We are an equal opportunity employer and value diversity, inclusion and equity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

Parksmart
Agency job
via Parksmart by Saurav Kumar
Remote, Noida
0 - 1 yrs
₹10000 - ₹15000 / mo
NodeJS (Node.js)
Amazon Web Services (AWS)
React.js
SQL
MongoDB
+1 more


🚀 We're Urgently Hiring – Node.js Backend Development Intern

Join our backend team as an intern and get hands-on experience building scalable, real-world applications with Node.js, Firebase, and AWS.

📍 Remote / Onsite

📅 Duration: 2 Months


🔧 What You’ll Work On:

Backend development using Node.js

Firebase, SQL & NoSQL database management

RESTful API integration

Deployment on AWS infrastructure


Deltek
shwetha V
Posted by shwetha V
Remote only
4 - 7 yrs
Best in industry
Product support
Technical support
SQL
Customer Support


Support Services Analyst

Company Summary :

As the recognized global standard for project-based businesses, Deltek delivers software and information solutions to help organizations achieve their purpose. Our market leadership stems from the work of our diverse employees who are united by a passion for learning, growing and making a difference. At Deltek, we take immense pride in creating a balanced, values-driven environment, where every employee feels included and empowered to do their best work. Our employees put our core values into action daily, creating a one-of-a-kind culture that has been recognized globally. Thanks to our incredible team, Deltek has been named one of America's Best Midsize Employers by Forbes, a Best Place to Work by Glassdoor, a Top Workplace by The Washington Post and a Best Place to Work in Asia by World HRD Congress. www.deltek.com


Business Summary :

Deltek’s award-winning Support Services team provides best-in-class assistance to Deltek’s customers across the world via phone, chat and email. Our team comprises a diverse, collaborative and passionate group of professionals who come from varying industries, backgrounds and professions. Our diversity and passion are our strength, so however you identify and whatever background you bring, we invite you to explore our team as a potential next step in your career!


External Job Title :

Support Services Analyst

Position Responsibilities :

  • Serve as the second level of support for all customer queries.
  • Resolve queries and issues by ensuring that all assigned requests are addressed within the SLA, or escalate them further as required.
  • Take ownership of and responsibility for issues from start through to a successful resolution, and follow the escalation process to speed up resolution.
  • Work effectively and efficiently in partnership with other departments to prevent delays in resolution.
  • Apply technology in multiple ways to configure the product and help customers implement Replicon's products.
  • Develop a solid understanding of product limits and suggest ways of improving the product.
  • Logically understand the concepts of other SaaS-based products for integration requests.
  • Be multi-skilled across all three mediums (phone, chat, and email).

Qualifications :

  • Any Bachelor's Degree 
  • At least 2 years of experience in software application support and/or infrastructure support 
  • Basic understanding of Web technology, basic networking & hardware knowledge, and software applications 
  • Excellent communication skills - verbal, written, listening skills and interpersonal skills. 
  • Ability to communicate in a tactful, courteous manner and to deal with and resolve complex situations in a professional manner 
  • Ability to handle multiple tasks/projects simultaneously and effectively work individually or in a team environment 
  • Open to work in a 24/7 support environment 


FiftyFive Technologies Pvt Ltd
Remote, Indore, Jaipur, Gurugram
5 - 10 yrs
₹10L - ₹25L / yr
NodeJS (Node.js)
React.js
GraphQL
MongoDB
SQL
+2 more

Job Overview:


We are looking for a Full-Stack Developer with 4+ years of experience in software development. The ideal candidate will be proficient in both frontend and backend technologies, capable of building scalable and high-performance applications, and have a problem-solving mindset. You will collaborate with cross-functional teams to develop, optimize, and maintain web applications.

 

Key Responsibilities :

 

- Design, develop, and maintain web applications ensuring performance and scalability.  

- Work with backend services using Node.js (Express.js/NestJS) and databases.  

- Develop and maintain frontend applications using React.js (minimum 1 year experience required).  

- Integrate APIs (RESTful & GraphQL) and third-party services.  

- Write clean, maintainable, and efficient code following industry best practices.  

- Ensure security, reliability, and optimization in applications.  

- Participate in debugging, troubleshooting, and performance tuning.  

- Work closely with designers, product managers, and engineers to deliver high-quality solutions.  

- Stay updated with modern development trends and contribute to technical improvements.  

 

Required Skills & Qualifications :  


- 4+ years of experience in full-stack development.  

- Strong proficiency in JavaScript and TypeScript.  

- Hands-on experience with Node.js (Express.js/NestJS).  

- Minimum 1 year of experience working with React.js.  

- Experience with SQL (PostgreSQL, MySQL) and NoSQL (MongoDB) databases.  

- Proficiency in API design, development, and integration (RESTful, GraphQL).  

- Familiarity with version control tools (Git, GitHub/GitLab/Bitbucket).  

- Strong problem-solving and analytical skills.  

- Ability to work both independently and collaboratively in a team.  

 

Good to Have : 


- Experience with Cloud Services (AWS, Azure, or GCP).  

- Familiarity with Containerization (Docker, Kubernetes).  

- Knowledge of testing frameworks (Jest, Mocha, or Cypress).  

- Understanding of event-driven architectures and message queues (Kafka, RabbitMQ).

Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Remote, Kochi (Cochin), Trivandrum
8 - 15 yrs
₹10L - ₹24L / yr
skill iconJava
skill iconSpring Boot
skill iconPython
skill iconAngular (2+)
skill iconAmazon Web Services (AWS)
+7 more

Job Title : Technical Architect

Experience : 8 to 12+ Years

Location : Trivandrum / Kochi / Remote

Work Mode : Remote flexibility available

Notice Period : Immediate to max 15 days (30 days with negotiation possible)


Summary :

We are looking for a highly skilled Technical Architect with expertise in Java Full Stack development, cloud architecture, and modern frontend frameworks (Angular). This is a client-facing, hands-on leadership role, ideal for technologists who enjoy designing scalable, high-performance, cloud-native enterprise solutions.


🛠 Key Responsibilities :

  • Architect scalable and high-performance enterprise applications.
  • Hands-on involvement in system design, development, and deployment.
  • Guide and mentor development teams in architecture and best practices.
  • Collaborate with stakeholders and clients to gather and refine requirements.
  • Evaluate tools, processes, and drive strategic technical decisions.
  • Design microservices-based solutions deployed over cloud platforms (AWS/Azure/GCP).

Mandatory Skills :

  • Backend : Java, Spring Boot, Python
  • Frontend : Angular (at least 2 years of recent hands-on experience)
  • Cloud : AWS / Azure / GCP
  • Architecture : Microservices, EAI, MVC, Enterprise Design Patterns
  • Data : SQL / NoSQL, Data Modeling
  • Other : Client handling, team mentoring, strong communication skills

Nice to Have Skills :

  • Mobile technologies (Native / Hybrid / Cross-platform)
  • DevOps & Docker-based deployment
  • Application Security (OWASP, PCI DSS)
  • TOGAF familiarity
  • Test-Driven Development (TDD)
  • Analytics / BI / ML / AI exposure
  • Domain knowledge in Financial Services or Payments
  • 3rd-party integration tools (e.g., MuleSoft, BizTalk)

⚠️ Important Notes :

  • Only candidates from outside Hyderabad/Telangana and non-JNTU graduates will be considered.
  • Candidates must be serving notice or joinable within 30 days.
  • Client-facing experience is mandatory.
  • Java Full Stack candidates are highly preferred.

🧭 Interview Process :

  1. Technical Assessment
  2. Two Rounds – Technical Interviews
  3. Final Round
Read more
Hunarstreet Technologies pvt ltd

Hunarstreet Technologies pvt ltd

Agency job
via Hunarstreet Technologies pvt ltd by Sakshi Patankar
Remote only
10 - 20 yrs
₹15L - ₹30L / yr
Data engineering
databricks
skill iconPython
skill iconScala
Spark
+14 more

What You’ll Be Doing:

● Design and build parts of our data pipeline architecture for extraction, transformation, and loading of data from a wide variety of data sources using the latest Big Data technologies.

● Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

● Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.

● Work with machine learning, data, and analytics experts to drive innovation, accuracy and greater functionality in our data system.

Qualifications:

● Bachelor's degree in Engineering, Computer Science, or relevant field.

● 10+ years of relevant and recent experience in a Data Engineer role.

● 5+ years recent experience with Apache Spark and solid understanding of the fundamentals.

● Deep understanding of Big Data concepts and distributed systems.

● Strong coding skills with Scala, Python, Java and/or other languages and the ability to quickly switch between them with ease.

● Advanced working SQL knowledge and experience working with a variety of relational databases such as Postgres and/or MySQL.

● Cloud experience with Databricks

● Experience working with data stored in many formats including Delta Tables, Parquet, CSV and JSON.

● Comfortable working in a linux shell environment and writing scripts as needed.

● Comfortable working in an Agile environment

● Machine Learning knowledge is a plus.

● Must be capable of working independently and delivering stable, efficient and reliable software.

● Excellent written and verbal communication skills in English.

● Experience supporting and working with cross-functional teams in a dynamic environment


EMPLOYMENT TYPE: Full-Time, Permanent

LOCATION: Remote (Pan India)

SHIFT TIMINGS: 2:00 pm - 11:00 pm IST 

Read more
Automate Accounts

at Automate Accounts

2 candid answers
Namrata Das
Posted by Namrata Das
Remote only
2 - 5 yrs
₹6L - ₹15L / yr
skill iconPython
SQL
Artificial Intelligence (AI)
skill iconMachine Learning (ML)
skill iconGitHub
+2 more

Responsibilities


Develop and maintain web and backend components using Python, Node.js, and Zoho tools


Design and implement custom workflows and automations in Zoho


Perform code reviews to maintain quality standards and best practices


Debug and resolve technical issues promptly


Collaborate with teams to gather and analyze requirements for effective solutions


Write clean, maintainable, and well-documented code


Manage and optimize databases to support changing business needs


Contribute individually while mentoring and supporting team members


Adapt quickly to a fast-paced environment and meet expectations within the first month



Leadership Opportunities


Lead and mentor junior developers in the team


Drive projects independently while collaborating with the broader team


Act as a technical liaison between the team and stakeholders to deliver effective solutions



Selection Process


1. HR Screening: Review of qualifications and experience


2. Online Technical Assessment: Test coding and problem-solving skills


3. Technical Interview: Assess expertise in web development, Python, Node.js, APIs, and Zoho


4. Leadership Evaluation: Evaluate team collaboration and leadership abilities


5. Management Interview: Discuss cultural fit and career opportunities


6. Offer Discussion: Finalize compensation and role specifics



Experience Required


2–5 years of relevant experience as a Software Developer


Proven ability to work as a self-starter and contribute individually


Strong technical and interpersonal skills to support team members effectively


Read more
BigRio
Disha Bhardwaj
Posted by Disha Bhardwaj
Remote only
7 - 14 yrs
₹14L - ₹15L / yr
Documentation
Scripting
testing
skill iconJavascript
SQL
+2 more

Solution Engineer                                                             


 

Primary Responsibilities

●         Serve as the primary resource during the client implementation/onboarding phase

 

●         Identify, document, and define customer business and technical needs

 

●         Develop clear user documentation, instructions, and standard procedures

 

●         Deliver training sessions on solution administration and usage

 

●         Participate in customer project calls and serve as a subject matter expert on solutions

 

●         Coordinate tasks across internal and client project teams, ensuring accountability and progress tracking

 

●         Perform hands-on configuration, scripting, data imports, testing, and knowledge transfer activities

 

●         Translate business requirements into technical specifications for product configuration or enhancements

 

●         Collaborate with global team members across multiple time zones, including the U.S., India, and China

 

●         Build and maintain strong customer relationships to gather and validate requirements


●         Contribute to the development of implementation best practices and suggest improvements to processes

 

●         Execute other tasks and duties as assigned

 

 

Note: Salary offered will depend on the candidate's qualifications and experience.




 

Required Skills & Experience

●         Proven experience leading software implementation projects from presales through delivery

 

●         Strong organizational skills with the ability to manage multiple detailed and interdependent tasks

 

●         2–5 years of experience in JavaScript and web development, including prior implementation work in a software company

 

●         Proficiency in some or all of the following:

 

○         JavaScript, PascalScript, MS SQL Script, RESTful APIs, Azure, Postman

 

○         Embarcadero RAD Studio, Delphi

 

○         Basic SQL and debugging

 

○         SMS integration and business intelligence tools

 

●         General knowledge of database structures and data migration processes

 

●         Familiarity with project management tools and methodologies

 

●         Strong interpersonal skills with a focus on client satisfaction and relationship-building

 

●         Self-starter with the ability to work productively in a remote, distributed team environment

 

●         Experience in energy efficiency retrofits, construction, or utility demand-side management is a plus

Read more
BigRio
Disha Bhardwaj
Posted by Disha Bhardwaj
Remote only
12 - 17 yrs
₹35L - ₹45L / yr
skill iconC#
SQL
skill iconJava
Microsoft Windows Azure

Responsibilities:


●        Technical Leadership:                                                                                                     

○        Architect and design complex software systems

○        Lead the development team in implementing software solutions

○        Ensure adherence to coding standards and best practices

○        Conduct code reviews and provide constructive feedback

○        Troubleshoot and resolve technical issues

●        Project Management:                                                                                                       

○        Collaborate with project managers to define project scope and requirements

○        Estimate project timelines and resource needs

○        Track project progress and ensure timely delivery

○        Manage risks and identify mitigation strategies

●        Team Development:                                                                                                         

○        Mentor and coach junior developers

○        Foster a collaborative and supportive team environment

○        Conduct performance evaluations and provide feedback

○        Identify training and development opportunities for team members

●        Innovation:                                                                                                                          

○        Stay abreast of emerging technologies and industry trends

○        Evaluate and recommend new technologies for adoption

○        Encourage experimentation and innovation within the team

 

Qualifications                                                                                                                                 

 

●        Experience:                                                                                                                        

○        12+ years of experience in software development

○        4+ years of experience in a leadership role

○        Proven track record of delivering successful software projects

●        Skills:                                                                                                                                    

○        Strong proficiency in the C# programming language


○        Good knowledge of Java for reporting

○        Strong SQL skills on Microsoft Azure

○        Expertise in software development methodologies (e.g., Agile, Scrum)

○        Excellent problem-solving and analytical skills

○        Strong communication and interpersonal skills

○        Ability to work independently and as part of a team

 

                                                                                                   


Read more
RockED
Kashish Trehan
Posted by Kashish Trehan
Remote only
4 - 9 yrs
₹15L - ₹40L / yr
skill iconNodeJS (Node.js)
MySQL
skill iconJavascript
SQL
skill iconExpress
+3 more

Your Impact

  • Build scalable backend services.
  • Design, implement, and maintain databases, ensuring data integrity, security, and efficient retrieval.
  • Implement the core logic that makes applications work, handling data processing, user requests, and system operations
  • Contribute to the architecture and design of new product features
  • Optimize systems for performance, scalability, and security
  • Stay up-to-date with new technologies and frameworks, contributing to the advancement of software development practices
  • Working closely with product managers and designers to turn ideas into reality and shape the product roadmap.

What skills do you need?


  • 4+ years of experience in backend development, especially building robust APIs using Node.js, Express.js, and MySQL
  • Strong command of JavaScript and understanding of its quirks and best practices
  • Ability to think strategically when designing systems—not just how to build, but why
  • Exposure to system design and interest in building scalable, high-availability systems
  • Prior work on B2C applications with a focus on performance and user experience
  • Ensure that applications can handle increasing loads and maintain performance, even under heavy traffic
  • Work with complex queries for performing sophisticated data manipulation, analysis, and reporting.
  • Knowledge of Sequelize, MongoDB and AWS would be an advantage.
  • Experience in optimizing backend systems for speed and scalability.


Read more
Astegic

at Astegic

3 recruiters
Agency job
via Hunarstreet Technologies pvt ltd by Priyanka Londhe
Remote only
10 - 13 yrs
₹30L - ₹50L / yr
skill iconScala
Apache Spark
Big Data
skill iconPython
skill iconJava
+3 more

POSITION:

Senior Data Engineer

The Senior Data Engineer will be responsible for building and extending our data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys working with big data and building systems from the ground up.

You will collaborate with our software engineers, database architects, data analysts and data scientists to ensure our data delivery architecture is consistent throughout the platform. You must be self-directed and comfortable supporting the data needs of multiple teams, systems and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company’s data architecture to support our next generation of products and data initiatives.


What You’ll Be Doing:

● Design and build parts of our data pipeline architecture for extraction, transformation, and loading of data from a wide variety of data sources using the latest Big Data technologies.

● Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

● Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.

● Work with machine learning, data, and analytics experts to drive innovation, accuracy and greater functionality in our data system.
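The extract-transform-load loop described in these bullets can be sketched in miniature with Python's built-in sqlite3 module standing in for a real warehouse; the table, columns, and records here are invented for illustration, not taken from any actual pipeline:

```python
import sqlite3

# Toy "source" records, as an extraction step might yield them.
source_rows = [
    {"id": 1, "email": "A@Example.com", "score": "0.90"},
    {"id": 2, "email": "b@example.com", "score": "0.75"},
    {"id": 1, "email": "a@example.com", "score": "0.95"},  # later, corrected record
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE contacts (id INTEGER PRIMARY KEY, email TEXT, score REAL)")

for row in source_rows:
    # Transform: normalize email casing, cast score to float.
    email = row["email"].strip().lower()
    score = float(row["score"])
    # Load idempotently: replacing on the primary key keeps re-runs
    # from duplicating rows, which matters when a pipeline is retried.
    conn.execute(
        "INSERT OR REPLACE INTO contacts (id, email, score) VALUES (?, ?, ?)",
        (row["id"], email, score),
    )

rows = conn.execute("SELECT id, email, score FROM contacts ORDER BY id").fetchall()
print(rows)  # → [(1, 'a@example.com', 0.95), (2, 'b@example.com', 0.75)]
```

A production pipeline would do the same dance against Spark and a warehouse, but the shape — extract, normalize, load idempotently — is what the role asks for.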

Qualifications:

● Bachelor's degree in Engineering, Computer Science, or relevant field.

● 10+ years of relevant and recent experience in a Data Engineer role.

● 5+ years recent experience with Apache Spark and solid understanding of the fundamentals.

● Deep understanding of Big Data concepts and distributed systems.

● Strong coding skills with Scala, Python, Java and/or other languages and the ability to quickly switch between them with ease.

● Advanced working SQL knowledge and experience working with a variety of relational databases such as Postgres and/or MySQL.

● Cloud experience with Databricks

● Experience working with data stored in many formats including Delta Tables, Parquet, CSV and JSON.

● Comfortable working in a linux shell environment and writing scripts as needed.

● Comfortable working in an Agile environment

● Machine Learning knowledge is a plus.

● Must be capable of working independently and delivering stable, efficient and reliable software.

● Excellent written and verbal communication skills in English.

● Experience supporting and working with cross-functional teams in a dynamic environment.

REPORTING: This position will report to our CEO or any other Lead as assigned by Management.

EMPLOYMENT TYPE: Full-Time, Permanent

LOCATION: Remote (Pan India)

SHIFT TIMINGS: 2:00 pm - 11:00 pm IST

WHO WE ARE:

SalesIntel is the top revenue intelligence platform on the market. Our combination of automation and researchers allows us to reach 95% data accuracy for all our published contact data, while continuing to scale up our number of contacts. We currently have more than 5 million human-verified contacts, another 70 million plus machine-processed contacts, and the highest number of direct dial contacts in the industry. We guarantee our accuracy with our well-trained research team that re-verifies every direct dial number, email, and contact every 90 days. With the most comprehensive contact and company data and our excellent customer service, SalesIntel has the best B2B data available. For more information, please visit – www.salesintel.io

WHAT WE OFFER: SalesIntel’s workplace is all about diversity. Different countries and cultures are represented in our workforce. We are growing at a fast pace and our work environment is constantly evolving with changing times. We motivate our team to better themselves by offering all the good stuff you’d expect like Holidays, Paid Leaves, Bonuses, Incentives, Medical Policy and company paid Training Programs.

SalesIntel is an Equal Opportunity Employer. We prohibit discrimination and harassment of any type and offer equal employment opportunities to employees and applicants without regard to race, color, religion, sex, sexual orientation, gender identity or expression, pregnancy, age, national origin, disability status, genetic information, protected veteran status, or any other characteristic protected by law.

Read more
Remote only
4 - 10 yrs
₹20L - ₹30L / yr
Artificial Intelligence (AI)
Large Language Models (LLM) tuning
skill iconPython
SQL
Retrieval Augmented Generation (RAG)
+10 more

Knowledge of the Gen AI technology ecosystem, including top-tier LLMs, prompt engineering, development frameworks such as LlamaIndex and LangChain, LLM fine-tuning, and experience architecting RAG and other LLM-based solutions for enterprise use cases.

1. Strong proficiency in programming languages like Python and SQL.
2. 3+ years of experience in predictive/prescriptive analytics, including Machine Learning algorithms (supervised and unsupervised), deep learning algorithms, and Artificial Neural Networks such as regression, classification, ensemble models, RNN, LSTM, and GRU.
3. 2+ years of experience in NLP, text analytics, Document AI, OCR, sentiment analysis, entity recognition, and topic modeling.
4. Proficiency in LangChain and open LLM frameworks to perform summarization, classification, named entity recognition, and question answering.
5. Proficiency in generative techniques: prompt engineering, vector DBs, and LLMs such as OpenAI, LlamaIndex, and Azure OpenAI; open-source LLMs will be important.
6. Hands-on experience in GenAI technology areas including RAG architecture, fine-tuning techniques, inferencing frameworks, etc.
7. Familiarity with big data technologies/frameworks.
8. Sound knowledge of Microsoft Azure.
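The RAG architectures mentioned above all share one core step: retrieving the passages most relevant to a query before handing them to an LLM. A dependency-free toy sketch of that retrieval step, with bag-of-words cosine similarity standing in for a real embedding model and vector DB (the documents and query are invented):

```python
import math
from collections import Counter

def embed(text):
    """Toy 'embedding': a bag-of-words term-frequency vector.
    A real RAG stack would call an embedding model instead."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# A stand-in for a vector DB: documents indexed alongside their vectors.
docs = [
    "invoices are generated on the first of each month",
    "password resets are handled by the identity team",
    "monthly invoices can be downloaded from the billing page",
]
index = [(d, embed(d)) for d in docs]

def retrieve(query, k=2):
    qv = embed(query)
    ranked = sorted(index, key=lambda pair: cosine(qv, pair[1]), reverse=True)
    return [d for d, _ in ranked[:k]]

# The retrieved context would be prepended to the LLM prompt.
context = retrieve("where do I download my monthly invoice?")
print(context[0])
```

Swapping the toy `embed` for a real embedding model and the list for a vector store gives the production shape of the pipeline.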

Read more
Deqode

at Deqode

1 recruiter
Shraddha Katare
Posted by Shraddha Katare
Remote only
10 - 12 yrs
₹10L - ₹20L / yr
Symfony
skill iconPHP
SQL
skill iconAmazon Web Services (AWS)
skill iconJavascript

Profile: Senior PHP Developer

Experience- 10+Years

Mode: Remote

Required Skills:

  • PHP (10+ years) & Symfony framework (5+ years)
  • Team leadership experience (3+ years)
  • OOP, design patterns, RESTful APIs
  • Database optimization (MySQL/PostgreSQL)
  • Git, CI/CD, testing frameworks
  • Excellent communication skills

Responsibilities:

  • Lead PHP/Symfony development
  • Mentor team members
  • Ensure code quality through reviews
  • Collaborate with stakeholders
  • Manage sprint cycles
  • Optimize application performance


Read more
Quanteon Solutions
DurgaPrasad Sannamuri
Posted by DurgaPrasad Sannamuri
Remote, Hyderabad
5 - 8 yrs
₹25L - ₹35L / yr
skill iconReact.js
skill iconNodeJS (Node.js)
skill iconJavascript
SQL
skill iconMongoDB
+5 more

We are looking for a highly skilled Senior Software Engineer with over 5 years of experience in full stack development using React.js and Node.js. As a senior member of our engineering team, you’ll take ownership of complex technical challenges, influence architecture decisions, mentor junior developers, and contribute to high-impact products.


Key Responsibilities:

Design, build, and maintain scalable web applications using React.js (frontend) and Node.js (backend).

Architect robust, secure, and scalable backend APIs and frontend components.

Collaborate closely with Product Managers, Designers, and DevOps to deliver end-to-end features.

Conduct code reviews, enforce best practices, and guide junior developers.

Optimize application performance, scalability, and responsiveness.

Troubleshoot, debug, and upgrade existing systems.

Stay current with new technologies and advocate for continuous improvement.


Required Qualifications:

Bachelor’s or Master’s degree in Computer Science, Engineering, or related field.

5+ years of experience in full stack development.

Strong expertise in React.js and related libraries (Redux, Hooks, etc.).

In-depth experience with Node.js, Express.js, and RESTful APIs.

Proficiency with JavaScript/TypeScript and modern frontend tooling (Webpack, Babel, etc.).

Experience with relational and NoSQL databases (e.g., PostgreSQL, MongoDB).

Solid understanding of CI/CD, testing (Jest, Mocha), and version control (Git).

Familiarity with cloud services (AWS/GCP/Azure) and containerization (Docker, Kubernetes) is a plus.

Excellent communication and problem-solving skills.


Nice to Have:

Experience with microservices architecture.

Knowledge of GraphQL.

Exposure to serverless computing.

Prior experience working in Agile/Scrum teams.

Read more
 Zazmic Inc

Zazmic Inc

Agency job
Remote only
5 - 8 yrs
₹10L - ₹15L / yr
Artificial Intelligence (AI)
skill iconMachine Learning (ML)
databricks
skill iconPython
SQL
+4 more

Title: Data Engineer II (Remote – India/Portugal)

Exp: 4- 8 Years

CTC: up to 30 LPA


Required Skills & Experience:

  • 4+ years in data engineering or backend software development
  • AI/ML experience is important
  • Expert in SQL and data modeling
  • Strong Python, Java, or Scala coding skills
  • Experience with Snowflake, Databricks, AWS (S3, Lambda)
  • Background in relational and NoSQL databases (e.g., Postgres)
  • Familiar with Linux shell and systems administration
  • Solid grasp of data warehouse concepts and real-time processing
  • Excellent troubleshooting, documentation, and QA mindset


If interested, kindly share your updated CV to 82008 31681

Read more
Deqode

at Deqode

1 recruiter
Roshni Maji
Posted by Roshni Maji
Remote only
5 - 7 yrs
₹12L - ₹16L / yr
skill iconPython
Google Cloud Platform (GCP)
SQL
PySpark
Data Transformation Tool (DBT)
+2 more

Role: GCP Data Engineer

Notice Period: Immediate Joiners

Experience: 5+ years

Location: Remote

Company: Deqode


About Deqode

At Deqode, we work with next-gen technologies to help businesses solve complex data challenges. Our collaborative teams build reliable, scalable systems that power smarter decisions and real-time analytics.


Key Responsibilities

  • Build and maintain scalable, automated data pipelines using Python, PySpark, and SQL.
  • Work on cloud-native data infrastructure using Google Cloud Platform (BigQuery, Cloud Storage, Dataflow).
  • Implement clean, reusable transformations using DBT and Databricks.
  • Design and schedule workflows using Apache Airflow.
  • Collaborate with data scientists and analysts to ensure downstream data usability.
  • Optimize pipelines and systems for performance and cost-efficiency.
  • Follow best software engineering practices: version control, unit testing, code reviews, CI/CD.
  • Manage and troubleshoot data workflows in Linux environments.
  • Apply data governance and access control via Unity Catalog or similar tools.
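The orchestration work above (Airflow DAGs, dependency-ordered pipelines) boils down to running tasks in topological order. A stdlib-only sketch of that idea with made-up task names; a real pipeline would use Airflow operators and a scheduler rather than plain functions:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Made-up pipeline: each task maps to the set of tasks it depends on,
# just as an Airflow DAG declares upstream dependencies.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "notify": {"load"},
}

results = []

def run(task):
    # In Airflow this would be an operator execution; here we just record it.
    results.append(task)

# static_order() yields tasks so that every dependency runs first.
for task in TopologicalSorter(dag).static_order():
    run(task)

print(results)  # → ['extract', 'transform', 'load', 'notify']
```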


Required Skills & Experience

  • Strong hands-on experience with PySpark, Spark SQL, and Databricks.
  • Solid understanding of GCP services (BigQuery, Cloud Functions, Dataflow, Cloud Storage).
  • Proficiency in Python for scripting and automation.
  • Expertise in SQL and data modeling.
  • Experience with DBT for data transformations.
  • Working knowledge of Airflow for workflow orchestration.
  • Comfortable with Linux-based systems for deployment and troubleshooting.
  • Familiar with Git for version control and collaborative development.
  • Understanding of data pipeline optimization, monitoring, and debugging.
Read more
Remote, Delhi, Gurugram, Noida, Ghaziabad, Faridabad
2 - 5 yrs
₹3L - ₹9L / yr
SQL
skill iconXML
JSON
TDL

Job Description:

As a Tally Developer, your main responsibility will be to develop custom solutions in Tally using TDL as per the customer requirements. You will work closely with clients, business analysts, Senior developers, and other stakeholders to understand their requirements and translate them into effective Tally-based solutions.

Responsibilities:

Collaborate with business analysts and the senior developer/project manager to gather and analyze client requirements.

Design, develop, and customize Tally-based software solutions to meet the specific requirements of clients.

Write efficient and well-documented code in Tally Definition Language (TDL) to extend the functionality of Tally software.

Follow the Software Development Life Cycle including requirements gathering, design, coding, testing, and deployment.

Troubleshoot and debug issues related to Tally customization, data import/export, and software integrations.

Provide technical support and assistance to clients and end-users in utilizing and troubleshooting Tally-based software solutions.

Stay updated with the latest features and updates in Tally software to leverage new functionalities in solution development.

Adhere to coding standards, documentation practices, and quality assurance processes.

Requirements:

Any Degree. Relevant work experience may be considered in place of a degree.

Experience in Tally development and customization for projects using Tally Definition Language (TDL).

Hands-on experience in Tally and implementation of its features.

Familiarity with database systems, data structures, and SQL for efficient data management and retrieval.

Strong problem-solving skills and attention to detail.

Good communication and teamwork abilities.

Continuous learning mindset to keep up with advancements in Tally software and related technologies.

Key Skills Required:

TDL (Tally Definition Language), Tally, Excel, XML/JSON.

Good to have Basic Skills:

Database like MS SQL, MySQL

API Integration.
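Tally exchanges data over XML, so the XML/JSON skills listed above usually mean translating between the two formats. A stdlib-only sketch using an invented, simplified voucher structure (real Tally XML envelopes are more deeply nested):

```python
import json
import xml.etree.ElementTree as ET

# Invented, simplified stand-in for a Tally voucher export.
tally_xml = """
<ENVELOPE>
  <VOUCHER>
    <DATE>20240401</DATE>
    <PARTYNAME>Acme Traders</PARTYNAME>
    <AMOUNT>1500.00</AMOUNT>
  </VOUCHER>
  <VOUCHER>
    <DATE>20240402</DATE>
    <PARTYNAME>Globex Ltd</PARTYNAME>
    <AMOUNT>250.50</AMOUNT>
  </VOUCHER>
</ENVELOPE>
"""

root = ET.fromstring(tally_xml)
# Flatten each voucher element into a plain dict for downstream use.
vouchers = [
    {
        "date": v.findtext("DATE"),
        "party": v.findtext("PARTYNAME"),
        "amount": float(v.findtext("AMOUNT")),
    }
    for v in root.findall("VOUCHER")
]

print(json.dumps(vouchers, indent=2))
```

The same parse-and-flatten pattern applies when pushing the JSON on to an external API.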

WORK EXPERIENCE- MINIMUM 2 YEARS AND MAXIMUM 7 YEARS

Interested candidates may WhatsApp their CV to TRIPLE NINE ZERO NINE THREE DOUBLE ONE DOUBLE FOUR.


Please answer the questions below:

Do you have knowledge of Tally Definition Language?

How many years of experience do you have with TDL?

Read more
Pattem Digital Technologies
Sanchari Sharma
Posted by Sanchari Sharma
Remote only
5 - 10 yrs
₹10L - ₹20L / yr
SQL
adobe campaign classic tool

The Consultant / Senior Consultant – Adobe Campaign is a technical role that involves providing consulting advice and support to clients implementing the Adobe Campaign solution, along with any technical advisory required afterwards. This is a client-facing role: the consultant liaises with the client, understands their technical and business requirements, and then implements Adobe Campaign in a way that gives the client the most value from the solution. The consultant's main objective is to drive successful delivery and maintain a high level of customer satisfaction.

What you need to succeed

• Expertise and experience in SQL (Oracle / SQL Server / PostgreSQL)

• Programming experience (JavaScript / Java / VB / C# / PHP)

• Knowledge on Web Technologies like HTML, CSS would be a plus

• Good communication skills to ensure effective customer interactions, communications, and documentation

• Self-starter - Organized and highly motivated

• Fast learner, ability to learn new technologies/languages

• Knowledge of HTML DOM manipulation and page load events a plus

• Project Management skills a plus

• Ability to develop creative solutions to problems

• Able to multi-task in a dynamic environment

• Able to work independently with minimal supervision

• Experience leading team members will be a plus

Adobe is an equal opportunity/affirmative action employer. We welcome and encourage diversity in the workplace.


Read more
Cymetrix Software

at Cymetrix Software

2 candid answers
Netra Shettigar
Posted by Netra Shettigar
Remote only
3 - 9 yrs
₹12L - ₹24L / yr
Looker
lookML
bigquery
SQL
Google Cloud Platform (GCP)

Proficient in Looker Actions, Looker dashboarding, Looker data entry, LookML, SQL queries, BigQuery, Looker Studio, and GCP.



Remote Working

2 pm to 12 am IST or

10:30 AM to 7:30 PM IST

Sunday to Thursday



Responsibilities:

● Create and maintain LookML code, which defines data models, dimensions, measures, and relationships within Looker.

● Develop reusable LookML components to ensure consistency and efficiency in report and dashboard creation.

● Build and customize dashboards, incorporating data visualizations such as charts and graphs to present insights effectively.

● Write complex SQL queries when necessary to extract and manipulate data from underlying databases and also optimize SQL queries for performance.

● Connect Looker to various data sources, including databases, data warehouses, and external APIs.

● Identify and address bottlenecks that affect report and dashboard loading times, and optimize Looker performance by tuning queries, caching strategies, and exploring indexing options.

● Configure user roles and permissions within Looker to control access to sensitive data, and implement data security best practices, including row-level and field-level security.

● Develop custom applications or scripts that interact with Looker's API for automation and integration with other tools and systems.

● Use version control systems (e.g., Git) to manage LookML code changes and collaborate with other developers.

● Provide training and support to business users, helping them navigate and use Looker effectively.

● Diagnose and resolve technical issues related to Looker, data models, and reports.
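
The SQL-tuning duties above can be illustrated with a self-contained sketch. A Looker deployment like this one would sit on a warehouse such as BigQuery; Python's built-in sqlite3 is used here only as a stand-in engine, and the orders table is invented for illustration:

```python
import sqlite3

# Build a small table to optimize against (illustrative schema).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
cur.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                [(i % 50, i * 1.5) for i in range(1000)])

# Without an index, filtering on customer_id scans the whole table.
plan_before = cur.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(total) FROM orders WHERE customer_id = 7"
).fetchall()

# Adding an index lets the engine seek instead of scan.
cur.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan_after = cur.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(total) FROM orders WHERE customer_id = 7"
).fetchall()

print(plan_before)  # plan mentions a full SCAN
print(plan_after)   # plan mentions SEARCH ... USING INDEX
```

On BigQuery the analogous check is the query execution details, and "indexing" translates into partitioning and clustering choices rather than CREATE INDEX.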


Skills Required:

● Experience in Looker's modeling language, LookML, including data models, dimensions, and measures.

● Strong SQL skills for writing and optimizing database queries across different SQL databases (GCP/BQ preferable)

● Knowledge of data modeling best practices

● Proficient in BigQuery, billing data analysis, GCP billing, unit costing, and invoicing, with the ability to recommend cost optimization strategies.

● Previous experience in FinOps engagements is a plus

● Proficiency in ETL processes for data transformation and preparation.

● Ability to create effective data visualizations and reports using Looker’s dashboard tools.

● Ability to optimize Looker performance by fine-tuning queries, caching strategies, and indexing.

● Familiarity with related tools and technologies, such as data warehousing (e.g., BigQuery ), data transformation tools (e.g., Apache Spark), and scripting languages (e.g., Python).

Read more
Remote only
3 - 5 yrs
₹4L - ₹7L / yr
Netsuite
SOAP
Suite Script
SuiteScript2.0
ODBC
+1 more

Job Summary:

SiGa Systems is looking for a skilled and motivated Software Developer with expertise in NetSuite API and ODBC integrations. The ideal candidate will design, develop, and maintain robust data integration solutions to move data seamlessly between NetSuite and external database systems. This role demands a deep understanding of NetSuite’s data model, SuiteTalk APIs, ODBC connectivity, and strong programming skills for data manipulation and integration.

Key Responsibilities:

1. NetSuite API Development
  • Design and implement custom integrations using NetSuite SuiteTalk REST and SOAP APIs.
  • Develop efficient, scalable scripts using SuiteScript 1.0 and 2.x.
  • Build and maintain Suitelets, Scheduled Scripts, User Event Scripts, and other custom NetSuite components.
  • Troubleshoot and resolve issues related to NetSuite API connections and data workflows.

2. ODBC Data Integration
  • Set up and manage ODBC connections for accessing NetSuite data.
  • Write complex SQL queries and stored procedures for ETL (Extract, Transform, Load) processes.
  • Design and execute data synchronization workflows between NetSuite and external databases (e.g., SQL Server, MySQL, PostgreSQL).
  • Ensure optimal performance and data accuracy across systems.

3. Data Modeling & Database Management
  • Analyze NetSuite data models and design efficient schemas for target systems.
  • Perform data mapping, transformation, and migration tasks.
  • Ensure data consistency and integrity throughout integration pipelines.
  • Monitor database performance and maintain system reliability.

4. Software Development & Documentation
  • Write clean, maintainable, and well-documented code.
  • Participate in code reviews and contribute to coding best practices.
  • Maintain technical documentation, including API specs, integration flows, and data mapping docs.
  • Use version control systems (e.g., Git) for collaboration and code management.

5. Collaboration & Communication
  • Work closely with business analysts, project managers, and cross-functional teams to understand integration requirements.
  • Provide technical guidance and regular progress updates to stakeholders.
  • Participate actively in Agile development processes and contribute to sprint planning and retrospectives.
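
The ODBC data-integration duties above follow a classic extract-transform-load shape, sketched below. A real job would open the source via pyodbc with a SuiteAnalytics Connect DSN; sqlite3 stands in here so the example is self-contained, and the table schema and exchange rates are invented for illustration:

```python
import sqlite3

# Stand-ins for a NetSuite ODBC source and an external target database.
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")

source.execute("CREATE TABLE transactions (id INTEGER, amount REAL, currency TEXT)")
source.executemany("INSERT INTO transactions VALUES (?, ?, ?)",
                   [(1, 100.0, "USD"), (2, 250.0, "EUR"), (3, 75.5, "USD")])

target.execute("CREATE TABLE transactions_usd (id INTEGER PRIMARY KEY, amount_usd REAL)")

RATES = {"USD": 1.0, "EUR": 1.1}  # illustrative fixed rates

# Extract from the source, transform (normalize currency), load into the target.
rows = source.execute("SELECT id, amount, currency FROM transactions").fetchall()
normalized = [(rid, round(amount * RATES[cur], 2)) for rid, amount, cur in rows]
target.executemany("INSERT INTO transactions_usd VALUES (?, ?)", normalized)
target.commit()

total = target.execute("SELECT SUM(amount_usd) FROM transactions_usd").fetchone()[0]
print(total)
```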


Read more
Agivant Technologies


Agency job
via Vidpro Consultancy Services by ashik thahir
Remote only
5 - 10 yrs
₹18L - ₹25L / yr
Python
SQL
Airflow
Snowflake
Elastic Search
+3 more

Experience: 5-8 Years

Work Mode: Remote

Job Type: Fulltime

Mandatory Skills: Python, SQL, Snowflake, Airflow, ETL, Data Pipelines, Elastic Search, and AWS.


Role Overview:

We are looking for a talented and passionate Senior Data Engineer to join our growing data team. In this role, you will play a key part in building and scaling our data infrastructure, enabling data-driven decision-making across the organization. You will be responsible for designing, developing, and maintaining efficient and reliable data pipelines for both ELT (Extract, Load, Transform) and ETL (Extract, Transform, Load) processes.


Responsibilities:

  • Design, develop, and maintain robust and scalable data pipelines for ELT and ETL processes, ensuring data accuracy, completeness, and timeliness.
  • Work with stakeholders to understand data requirements and translate them into efficient data models and pipelines.
  • Build and optimize data pipelines using a variety of technologies, including Elastic Search, AWS S3, Snowflake, and NFS.
  • Develop and maintain data warehouse schemas and ETL/ELT processes to support business intelligence and analytics needs.
  • Implement data quality checks and monitoring to ensure data integrity and identify potential issues.
  • Collaborate with data scientists and analysts to ensure data accessibility and usability for various analytical purposes.
  • Stay current with industry best practices, CI/CD/DevSecFinOps, Scrum and emerging technologies in data engineering.
  • Contribute to the development and enhancement of our data warehouse architecture
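
The data-quality-check responsibility above can be sketched as a small validation pass over a batch before it is loaded; the field names and thresholds below are illustrative, not from any real pipeline:

```python
from datetime import datetime, timedelta

def check_batch(records, max_age_days=2):
    """Return a list of human-readable data-quality issues found in a batch:
    completeness (missing keys), accuracy (out-of-range values), and
    timeliness (records older than a cutoff)."""
    issues = []
    cutoff = datetime.now() - timedelta(days=max_age_days)
    for i, rec in enumerate(records):
        if rec.get("id") is None:
            issues.append(f"row {i}: missing id")
        if not isinstance(rec.get("amount"), (int, float)) or rec["amount"] < 0:
            issues.append(f"row {i}: invalid amount")
        if rec.get("loaded_at") and rec["loaded_at"] < cutoff:
            issues.append(f"row {i}: stale record")
    return issues

batch = [
    {"id": 1, "amount": 9.5, "loaded_at": datetime.now()},
    {"id": None, "amount": -3, "loaded_at": datetime.now()},
]
print(check_batch(batch))
```

In a production pipeline a check like this would typically run as an Airflow task that fails (or alerts) when the issue list is non-empty.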

Required Skills:

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • 5+ years of experience as a Data Engineer with a strong focus on ELT/ETL processes.
  • At least 3 years of experience with Snowflake data warehousing technologies.
  • At least 3 years of experience creating and maintaining Airflow ETL pipelines.
  • Minimum of 3 years of professional experience with Python for data manipulation and automation.
  • Working experience with Elastic Search and its application in data pipelines.
  • Proficiency in SQL and experience with data modelling techniques.
  • Strong understanding of cloud-based data storage solutions such as AWS S3.
  • Experience working with NFS and other file storage systems.
  • Excellent problem-solving and analytical skills.
  • Strong communication and collaboration skills.


Read more
Sun King

Posted by Reshika Mendiratta
Remote only
1yr+
Best in industry
Java
Spring Boot
J2EE
Microservices
Hibernate (Java)
+6 more

About Sun King

Sun King is the world’s leading off-grid solar energy company, delivering energy access to 1.8 billion people without reliable grid connections through innovative product design, fintech solutions, and field operations.

Key highlights:

  • Connected over 20 million homes to solar power across Africa and Asia, adding 200,000 homes monthly.
  • Affordable ‘pay-as-you-go’ financing model; after 1-2 years, customers own their solar equipment.
  • Saved customers over $4 billion to date.
  • Collect 650,000 daily payments via 28,000 field agents using mobile money systems.
  • Products range from home lighting to high-energy appliances, with expansion into clean cooking, electric mobility, and entertainment.

With 2,800 staff across 12 countries, our team includes experts in various fields, all passionate about serving off-grid communities.

Diversity Commitment:

44% of our workforce are women, reflecting our commitment to gender diversity.


About the role:

The Backend Developer works remotely as part of the technology team to help Sun King’s EasyBuy business unit design and develop software to improve its field team operations.


What you will be expected to do

  • Design and develop applications/systems based on wireframes and product requirements documents. 
  • Design and develop logical and physical data models to meet application requirements. 
  • Identify and resolve bottlenecks and bugs based on operational requirements.
  • Perform unit tests on code to ensure robustness, including edge cases, usability, and general reliability. 
  • Write reusable and easily maintainable code following the principles of DRY (Don’t Repeat Yourself). 
  • Integrate existing tools and business systems, both in-house and external services, such as ticketing software and communication tools. 
  • Collaborate with team members and product managers to understand project requirements and contribute to the overall system design. 


You might be a strong candidate if you have/are

  • Have 1-2 years of backend development experience, with strong problem-solving abilities and proficiency in data structures and algorithms. 
  • Have a strong grasp of object-oriented programming (OOP) principles and expertise in Core Java. 
  • Have knowledge of SQL and MySQL or similar database management systems. 
  • Have experience integrating web services such as SOAP, REST, JSON, and XML. 
  • Have familiarity with RESTful APIs for linking Android applications to backend services. 
  • Have experience with version control systems like Git (preferred, not mandatory). 
  • Have additional knowledge of web technologies like HTML, CSS, and JavaScript, and frameworks like Spring or Hibernate (advantageous). 
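
For the web-service integration skills above, here is a minimal sketch of consuming a JSON payload the way a REST backend would, using only the standard library. The payload shape (agent payments) is hypothetical, chosen to echo Sun King's mobile-money collections, and a real service would read it from an HTTP request rather than a hard-coded string:

```python
import json

# Hypothetical JSON body as it might arrive from a REST endpoint.
payload = '''
{
  "agent_id": 12345,
  "payments": [
    {"account": "A-1", "amount": 150.0},
    {"account": "A-2", "amount": 75.5}
  ]
}
'''

# Deserialize and aggregate, as a backend handler would before persisting.
data = json.loads(payload)
total_collected = sum(p["amount"] for p in data["payments"])
print(data["agent_id"], total_collected)
```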


What we offer (in addition to compensation and statutory benefits):

  • A platform for professional growth in a rapidly expanding, high-impact sector.
  • Immerse in a collaborative culture, energized by employees of Sun King who are collectively motivated by fostering a transformative, sustainable venture.
  • A genuinely global environment: Engage and learn alongside a diverse group from varied geographies and backgrounds.
  • Tailored learning pathways through the Sun King Center for Leadership to elevate your leadership and managerial capabilities.
Read more
HashRoot
Agency job
via HashRoot by Deepak S
Remote only
4 - 15 yrs
₹6L - ₹15L / yr
Windows Azure
DevOps
SQL
Shell Scripting
Bash
+2 more

Overview

As an engineer in the Service Operations division, you will be responsible for the day-to-day management of the systems and services that power client products. Working with your team, you will ensure daily tasks and activities are successfully completed and where necessary, use standard operating procedures and knowledge to resolve any faults/errors encountered.


Job Description

Key Tasks and Responsibilities:

Ensure daily tasks and activities have completed successfully. Where this is not the case, undertake recovery and remediation steps.

Undertake patching and upgrade activities in support of ParentPay compliance programs, namely PCI DSS, ISO 27001, and Cyber Essentials+.

Action requests from the ServiceNow work queue that have been allocated to your relevant resolver group. These include incidents, problems, changes and service requests.

Investigate alerts and events detected from the monitoring systems that indicate a change in component health.

Create and maintain support documentation in the form of departmental wiki and ServiceNow knowledge articles that allow for continual improvement of fault detection and recovery times.

Work with colleagues to identify and champion the automation of all manual interventions undertaken within the team.

Attend and complete all mandatory training courses.

Engage and own the transition of new services into Service Operations.

Participate in the out of hours on call support rota.


Qualifications and Experience:

Experience working in an IT service delivery or support function OR

MBA or Degree in Information Technology or Information Security.

Experience working with Microsoft technologies.

Excellent communication skills developed working in a service centric organisation.

Ability to interpret fault descriptions provided by customers or internal escalations and translate these into resolutions.

Ability to manage and prioritise own workload.

Experience working within Education Technology would be an advantage.


Technical knowledge:

Advanced automation scripting using Terraform and PowerShell.

Knowledge of Bicep and Ansible is advantageous.

Advanced Microsoft Active Directory configuration and support.

Microsoft Azure and AWS cloud hosting platform administration.

Advanced Microsoft SQL Server experience.

Windows Server and desktop management and configuration.

Microsoft IIS web services administration and configuration.

Advanced management of data and SQL backup solutions.

Advanced scripting and automation capabilities. 

Advanced knowledge of Azure analytics and KQL.

 

Skills & Requirements

IT Service Delivery, Information Technology, Information Security, Microsoft Technologies, Communication Skills, Fault Interpretation, Workload Prioritization, Automation Scripting, Terraform, PowerShell, Microsoft Active Directory, Microsoft Azure, AWS, Microsoft SQL Server, Windows Server, Windows Desktop Configuration, Microsoft IIS, Data Backup Management, SQL Backup Solutions, Scripting, Azure Analytics, KQL.

Read more
QAgile Services

Posted by Radhika Chotai
Remote only
5 - 9 yrs
₹10L - ₹18L / yr
SQL
T-SQL
SQL Server Reporting Services (SSRS)
Query optimization
SQL Query Analyzer
+1 more


Job Summary:

We are seeking an experienced MS-SQL Server Database Developer with 5+ years of hands-on experience in creating and managing database objects such as stored procedures, functions, triggers, and SSRS reports. The ideal candidate will be responsible for ensuring optimal performance, maintenance, and development of MS-SQL databases, including backup and restoration processes.

Key Responsibilities:

- Design, develop, and maintain stored procedures, functions, and triggers to support application functionality.

- Create and optimize complex queries for efficient data retrieval and manipulation.

- Develop reports using SQL Server Reporting Services (SSRS) or other reporting tools.

- Perform database backup, restoration, and recovery as per defined processes.

- Troubleshoot database-related issues and provide solutions for improving performance.

- Collaborate with development and QA teams to ensure database solutions meet business requirements.

- Ensure the security and integrity of data across all database environments.
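
The trigger work listed above is written in T-SQL on SQL Server; as a self-contained illustration of the audit-trigger pattern, the sketch below uses Python's sqlite3 (whose trigger dialect differs from T-SQL), with invented table names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL);
CREATE TABLE accounts_audit (
    account_id INTEGER,
    old_balance REAL,
    new_balance REAL
);
-- Record every balance change, mirroring an AFTER UPDATE audit trigger.
CREATE TRIGGER trg_accounts_update
AFTER UPDATE OF balance ON accounts
BEGIN
    INSERT INTO accounts_audit VALUES (OLD.id, OLD.balance, NEW.balance);
END;
""")

conn.execute("INSERT INTO accounts VALUES (1, 100.0)")
conn.execute("UPDATE accounts SET balance = 250.0 WHERE id = 1")

audit = conn.execute("SELECT * FROM accounts_audit").fetchall()
print(audit)
```

The T-SQL equivalent would use CREATE TRIGGER ... AFTER UPDATE and read the changed rows from the inserted and deleted pseudo-tables instead of OLD/NEW.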

Required Skills & Qualifications:

- 5+ years of experience working with MS-SQL Server (2016 or later).

- Strong expertise in writing complex T-SQL queries, stored procedures, functions, and triggers.

- Experience with report generation using SSRS or other reporting tools.

- Hands-on experience in backup, restoration, and database recovery processes.

- Familiarity with performance tuning and optimization techniques.

- Ability to work in a fast-paced environment and manage multiple tasks efficiently.

- Strong problem-solving and troubleshooting skills.

Role: Data Engineer

Industry Type: IT Services & Consulting

Department: Engineering - Software & QA

Employment Type: Full Time, 6M - 1Yr

Role Category: Software Development

Education

UG: BCA in Any Specialization, B.Tech/B.E. in Any Specialization

PG: M.Tech in Any Specialization


Read more
Adesso India


Agency job
via HashRoot by Deepak S
Remote only
4 - 15 yrs
₹8L - ₹25L / yr
Python
SQL
MongoDB
BigQuery
Java

Overview

Adesso India specialises in optimization of core business processes for organizations. Our focus is on providing state-of-the-art solutions that streamline operations and elevate productivity to new heights.

Comprised of a team of industry experts and experienced technology professionals, we ensure that our software development and implementations are reliable, robust, and seamlessly integrated with the latest technologies. By leveraging our extensive knowledge and skills, we empower businesses to achieve their objectives efficiently and effectively.


Job Description

We are looking for an experienced Backend and Data Developer with expertise in Java, SQL, and BigQuery development on public clouds, mainly GCP. As a Senior Data Developer, you will play a vital role in designing, building, and maintaining robust systems to support our data analytics. This position offers the opportunity to work on complex services, collaborating closely with cross-functional teams to drive successful project delivery.


Responsibilities:

Development and maintenance of data pipelines and automation scripts with Python.

Creation of data queries and optimization of database processes with SQL.

Use of bash scripts for system administration, automation and deployment processes.

Database and cloud technologies.

Managing, optimizing and querying large amounts of data in an Exasol database (prospectively Snowflake).

Google Cloud Platform (GCP): Operation and scaling of cloud-based BI solutions, in particular:

Composer (Airflow): Orchestration of data pipelines for ETL processes.

Cloud Functions: Development of serverless functions for data processing and automation.

Cloud Scheduler: Planning and automation of recurring cloud jobs.

Cloud Secret Manager: Secure storage and management of sensitive access data and API keys.

BigQuery: Processing, analyzing and querying large amounts of data in the cloud.

Cloud Storage: Storage and management of structured and unstructured data.

Cloud monitoring: monitoring the performance and stability of cloud-based applications.

Data visualization and reporting.

Creation of interactive dashboards and reports for the analysis and visualization of business data with Power BI.


Requirements:

Minimum of 4-6 years of experience in backend development, with strong expertise in BigQuery, Python and MongoDB or SQL.

Strong knowledge of database design, querying, and optimization with SQL and MongoDB, as well as designing ETL processes and orchestrating data pipelines.

Experience of at least 2 years with at least one hyperscaler, ideally GCP, combined with cloud storage technologies, cloud monitoring, and cloud secret management.

Excellent communication skills to effectively collaborate with team members and stakeholders.


Nice-to-Have:

 

Knowledge of agile methodologies and working in cross-functional, collaborative teams.

Skills & Requirements

SQL, BigQuery, GCP, Python, MongoDB, Exasol, Snowflake, Bash scripting, Airflow, Cloud Functions, Cloud Scheduler, Cloud Secret Manager, Cloud Storage, Cloud Monitoring, ETL, Data Pipelines, Power BI, Database Optimization, Cloud-Based BI Solutions, Data Processing, Data Automation, Agile Methodologies, Cross-Functional Collaboration.

Read more
Adesso India
Remote only
4 - 15 yrs
₹10L - ₹27L / yr
Python
Google Cloud Platform (GCP)
SQL
MongoDB
Java

Immediate Joiners Preferred. Notice Period - Immediate to 30 Days


Interested candidates are requested to email their resumes with the subject line "Application for [Job Title]".

Only applications received via email will be reviewed. Applications through other channels will not be considered.


About Us

adesso India is a dynamic and innovative IT Services and Consulting company based in Kochi. We are committed to delivering cutting-edge solutions that make a meaningful impact on our clients. As we continue to expand our development team, we are seeking a talented and motivated Backend Developer to join us in creating scalable and high-performance backend systems.


Job Description

We are looking for an experienced Backend and Data Developer with expertise in Java, SQL, and BigQuery development on public clouds, mainly GCP. As a Senior Data Developer, you will play a vital role in designing, building, and maintaining robust systems to support our data analytics. This position offers the opportunity to work on complex services, collaborating closely with cross-functional teams to drive successful project delivery.


Responsibilities

  • Development and maintenance of data pipelines and automation scripts with Python
  • Creation of data queries and optimization of database processes with SQL
  • Use of bash scripts for system administration, automation and deployment processes
  • Database and cloud technologies
  • Managing, optimizing and querying large amounts of data in an Exasol database (prospectively Snowflake)
  • Google Cloud Platform (GCP): Operation and scaling of cloud-based BI solutions, in particular:
  • Composer (Airflow): Orchestration of data pipelines for ETL processes
  • Cloud Functions: Development of serverless functions for data processing and automation
  • Cloud Scheduler: Planning and automation of recurring cloud jobs
  • Cloud Secret Manager: Secure storage and management of sensitive access data and API keys
  • BigQuery: Processing, analyzing and querying large amounts of data in the cloud
  • Cloud Storage: Storage and management of structured and unstructured data
  • Cloud monitoring: monitoring the performance and stability of cloud-based applications
  • Data visualization and reporting
  • Creation of interactive dashboards and reports for the analysis and visualization of business data with Power BI


Requirements

  • Minimum of 4-6 years of experience in backend development, with strong expertise in BigQuery, Python and MongoDB or SQL.
  • Strong knowledge of database design, querying, and optimization with SQL and MongoDB, as well as designing ETL processes and orchestrating data pipelines.
  • Experience of at least 2 years with at least one hyperscaler, ideally GCP, combined with cloud storage technologies, cloud monitoring, and cloud secret management.
  • Excellent communication skills to effectively collaborate with team members and stakeholders.

Nice-to-Have:

  • Knowledge of agile methodologies and working in cross-functional, collaborative teams.
Read more
Adesso India


Agency job
via HashRoot by Deepak S
Remote only
3 - 11 yrs
₹6L - ₹27L / yr
Data engineering
Data architecture
Amazon Web Services (AWS)
Windows Azure
Data Transformation Tool (DBT)
+3 more

Overview

adesso India specialises in optimization of core business processes for organizations. Our focus is on providing state-of-the-art solutions that streamline operations and elevate productivity to new heights.

Comprised of a team of industry experts and experienced technology professionals, we ensure that our software development and implementations are reliable, robust, and seamlessly integrated with the latest technologies. By leveraging our extensive knowledge and skills, we empower businesses to achieve their objectives efficiently and effectively.


Job Description

We are seeking a skilled Cloud Data Engineer with experience in cloud data platforms such as AWS or Azure, and especially Snowflake and dbt, to join our dynamic team. As a consultant, you will be responsible for developing new data platforms and creating the data processes. You will collaborate with cross-functional teams to design, develop, and deploy high-quality solutions. 


Responsibilities:

Customer consulting: You develop data-driven products in the Snowflake Cloud and connect data & analytics with specialist departments. You develop ELT processes using dbt (data build tool) 

Specifying requirements: You develop concrete requirements for future-proof cloud data architectures.

Develop data routes: You design scalable and powerful data management processes.

Analyze data: You derive sound findings from data sets and present them in an understandable way.


Requirements:

Requirements management and project experience: You successfully implement cloud-based data & analytics projects.

Data architectures: You are proficient in DWH/data lake concepts and modeling with Data Vault 2.0.

Cloud expertise: You have extensive knowledge of Snowflake, dbt and other cloud technologies (e.g. MS Azure, AWS, GCP).

SQL know-how: You have a sound and solid knowledge of SQL.

Data management: You are familiar with topics such as master data management and data quality.

Bachelor's degree in computer science, or a related field.

Strong communication and collaboration abilities to work effectively in a team environment.

 

Skills & Requirements

Cloud Data Engineering, AWS, Azure, Snowflake, dbt, ELT processes, Data-driven consulting, Cloud data architectures, Scalable data management, Data analysis, Requirements management, Data warehousing, Data lake, Data Vault 2.0, SQL, Master data management, Data quality, GCP, Strong communication, Collaboration.

Read more
Adesso India


Agency job
via HashRoot by Deepak S
Remote only
5 - 12 yrs
₹10L - ₹25L / yr
J2EE
JPA
EJB
JAAS
SAML
+7 more

Overview

Adesso India specialises in optimization of core business processes for organizations. Our focus is on providing state-of-the-art solutions that streamline operations and elevate productivity to new heights.

Comprised of a team of industry experts and experienced technology professionals, we ensure that our software development and implementations are reliable, robust, and seamlessly integrated with the latest technologies. By leveraging our extensive knowledge and skills, we empower businesses to achieve their objectives efficiently and effectively.


Job Description

The current application landscape features multiple Java web services running on JEE application servers, primarily hosted on AWS, and integrated with various systems such as SAP, other services, and external partners. DPS is committed to delivering the best digital work experience for the customer's employees and customers alike.


Responsibilities:

Independent front- and backend implementation of business functionalities, defined as user stories by the customer, considering the cost-value ratio and maintenance effort for the customer 

Implementation of user stories and incidents, including concept, implementation (including automated unit tests), and communication with the customer within the agile development process.

Database activities such as creation or modification of database schema as well as implementation of database access, queries, and data modification 

Interface realization based on standard principles like REST or SOAP

Implementation of given Identity and Access Management Patterns for securing the application

Analysis and resolution of issues (3rd Level Support).

Documentation of the implementation.

Consultancy in technical and business topics within the applications.

Usage of selected tools for implementation, testing, rollout, and support.

Participation in regular meetings with the client to track the status of assigned tasks.


Requirements:

Experience with JEE technologies such as JPA, EJB, CDI, JAAS, and SAML.

Experience with JEE-related technologies, such as Maven.

Proficiency in HTML5, CSS, Angular, and Bootstrap.

Strong knowledge of SQL.

Experience with web services (SOAP, REST, JSON).


Skills & Requirements

JEE, JPA, EJB, CDI, JAAS, SAML, Maven, HTML5, CSS, Angular, Bootstrap, SQL, SOAP, REST, JSON, Database schema design, Unit testing, Agile development, Identity and Access Management (IAM), Troubleshooting, Documentation, Third-level support.

Read more
Adesso India
Remote only
5 - 20 yrs
₹10L - ₹25L / yr
Java
Angular (2+)
JPA
EJB
JAAS
+9 more

Interested candidates are requested to email their resumes with the subject line "Application for [Job Title]".

Only applications received via email will be reviewed. Applications through other channels will not be considered.


Overview

Adesso India specialises in optimization of core business processes for organizations. Our focus is on providing state-of-the-art solutions that streamline operations and elevate productivity to new heights.

Comprised of a team of industry experts and experienced technology professionals, we ensure that our software development and implementations are reliable, robust, and seamlessly integrated with the latest technologies. By leveraging our extensive knowledge and skills, we empower businesses to achieve their objectives efficiently and effectively.


Job Description

The current application landscape features multiple Java web services running on JEE application servers, primarily hosted on AWS, and integrated with various systems such as SAP, other services, and external partners. DPS is committed to delivering the best digital work experience for the customer's employees and customers alike.


Responsibilities:

Independent front- and backend implementation of business functionalities, defined as user stories by the customer, considering the cost-value ratio and maintenance effort for the customer 

Implementation of user stories and incidents, including concept, implementation (including automated unit tests), and communication with the customer within the agile development process.

Database activities such as creation or modification of database schema as well as implementation of database access, queries, and data modification 

Interface realization based on standard principles like REST or SOAP

Implementation of given Identity and Access Management Patterns for securing the application

Analysis and resolution of issues (3rd Level Support).

Documentation of the implementation.

Consultancy in technical and business topics within the applications.

Usage of selected tools for implementation, testing, rollout, and support.

Participation in regular meetings with the client to track the status of assigned tasks.


Requirements:

Experience with JEE technologies such as JPA, EJB, CDI, JAAS, and SAML.

Experience with JEE-related technologies, such as Maven.

Proficiency in HTML5, CSS, Angular, and Bootstrap.

Strong knowledge of SQL.

Experience with web services (SOAP, REST, JSON).


Skills & Requirements

JEE, JPA, EJB, CDI, JAAS, SAML, Maven, HTML5, CSS, Angular, Bootstrap, SQL, SOAP, REST, JSON, Database schema design, Unit testing, Agile development, Identity and Access Management (IAM), Troubleshooting, Documentation, Third-level support.



Read more
Mayura Consultancy Services
Remote only
3 - 4 yrs
₹3.5L - ₹5.5L / yr
HTML/CSS
PHP
JavaScript
Bootstrap
CodeIgniter
+2 more

Position: Full Stack Developer (PHP CodeIgniter)

Company: Mayura Consultancy Services

Experience: 3 to 4 yrs

Location: Bangalore

Skills: HTML, CSS, Bootstrap, JavaScript, AJAX, jQuery, PHP, and CodeIgniter (CI)

Work Location: Work From Home(WFH)

Website : https://www.mayuraconsultancy.com/


Requirements :

  • Prior experience in full-stack development using PHP CodeIgniter


Perks of Working with MCS :

  • Contribute to Innovative Solutions: Join a dynamic team at the forefront of software development, contributing to innovative projects and shaping the technological solutions of the organization.
  • Work with Clients from across the Globe: Collaborate with clients from around the world, gaining exposure to diverse cultures and industries, and contributing to the development of solutions that address the unique needs and challenges of global businesses.
  • Complete Work From Home Opportunity: Enjoy the flexibility of working entirely from the comfort of your home, empowering you to manage your schedule and achieve a better work-life balance while coding innovative solutions for MCS.
  • Opportunity to Work on Projects Developing from Scratch: Engage in projects from inception to completion, working on solutions developed from scratch and having the opportunity to make a significant impact on the design, architecture, and functionality of the final product.
  • Diverse Projects: Be involved in a variety of development projects, including web applications, mobile apps, e-commerce platforms, and more, allowing you to showcase your versatility as a Full Stack Developer and expand your portfolio.


Joining MCS as a Full Stack Developer opens the door to a world where your technical skills can shine and grow, all while enjoying a supportive and dynamic work environment. We're not just building solutions; we're building the future—and you can be a key part of that journey.



Read more
Koantek
Bhoomika Varshney
Posted by Bhoomika Varshney
Remote only
4 - 8 yrs
₹10L - ₹30L / yr
skill iconPython
databricks
SQL
Spark
PySpark
+3 more

The Sr AWS/Azure/GCP Databricks Data Engineer at Koantek will use comprehensive modern data engineering techniques and methods with Advanced Analytics to support business decisions for our clients. Your goal is to support the use of data-driven insights to help our clients achieve business outcomes and objectives. You will collect, aggregate, and analyze structured/unstructured data from multiple internal and external sources and communicate patterns, insights, and trends to decision-makers. You will help design and build data pipelines, data streams, reporting tools, information dashboards, data service APIs, data generators, and other end-user information portals and insight tools. You will be a critical part of the data supply chain, ensuring that stakeholders can access and manipulate data for routine and ad hoc analysis to drive business outcomes using Advanced Analytics. You are expected to function as a productive member of a team, working and communicating proactively with engineering peers, technical leads, project managers, product owners, and resource managers.

Requirements:

  • Strong experience as an AWS/Azure/GCP Data Engineer; must have AWS/Azure/GCP Databricks experience.
  • Expert proficiency in Spark Scala, Python, and Spark.
  • Must have data migration experience from on-prem to cloud.
  • Hands-on experience with Kinesis to process and analyze stream data, Event/IoT Hubs, and Cosmos DB.
  • In-depth understanding of Azure/AWS/GCP cloud, data lake, and analytics solutions.
  • Expert-level hands-on experience designing and developing applications on Databricks.
  • Extensive hands-on experience implementing data migration and data processing using AWS/Azure/GCP services.
  • In-depth understanding of Spark architecture, including Spark Streaming, Spark Core, Spark SQL, DataFrames, RDD caching, and Spark MLlib.
  • Hands-on experience with the industry technology stack for data management, data ingestion, capture, processing, and curation: Kafka, StreamSets, Attunity, GoldenGate, MapReduce, Hadoop, Hive, HBase, Cassandra, Spark, Flume, Impala, etc.
  • Hands-on knowledge of data frameworks, data lakes, and open-source projects such as Apache Spark, MLflow, and Delta Lake.
  • Good working knowledge of code versioning tools such as Git, Bitbucket, or SVN.
  • Hands-on experience using Spark SQL with various data sources such as JSON, Parquet, and key-value pairs.
  • Experience preparing data for data science and machine learning, with exposure to model selection, model lifecycle, hyperparameter tuning, model serving, deep learning, etc.
  • Demonstrated experience preparing data and automating and building data pipelines for AI use cases (text, voice, image, IoT data, etc.).
  • Good to have: programming experience with .NET or Spark/Scala.
  • Experience creating tables, partitioning, bucketing, and loading and aggregating data using Spark Scala and Spark SQL/PySpark.
  • Knowledge of AWS/Azure/GCP DevOps processes such as CI/CD, as well as Agile tools and processes including Git, Jenkins, Jira, and Confluence.
  • Working experience with Visual Studio, PowerShell scripting, and ARM templates; able to build ingestion to ADLS and enable a BI layer for analytics.
  • Strong understanding of data modeling and defining conceptual, logical, and physical data models.
  • Big data/analytics/information analysis/database management in the cloud.
  • IoT/event-driven/microservices in the cloud; experience with private and public cloud architectures, their pros/cons, and migration considerations.
  • Ability to stay up to date with industry standards and technological advancements that will enhance data quality and reliability to advance strategic initiatives.
  • Working knowledge of RESTful APIs, the OAuth2 authorization framework, and security best practices for API gateways.
  • Guide customers in transforming big data projects, including development and deployment of big data and AI applications.
  • Guide customers on data engineering best practices; provide proofs of concept, architect solutions, and collaborate where needed.
  • 2+ years of hands-on experience designing and implementing multi-tenant solutions using AWS/Azure/GCP Databricks for data governance, data pipelines for near-real-time data warehouses, and machine learning solutions.
  • 5+ years of overall experience in software development, data engineering, or data analytics using Python, PySpark, Scala, Spark, Java, or equivalent technologies.
  • Hands-on expertise in Apache Spark (Scala or Python).
  • 3+ years of experience in query tuning, performance tuning, troubleshooting, and debugging Spark and other big data solutions.
  • Bachelor's or Master's degree in Big Data, Computer Science, Engineering, Mathematics, or a similar area of study, or equivalent work experience.
  • Ability to manage competing priorities in a fast-paced environment.
  • Ability to resolve issues.
  • Basic experience with or knowledge of agile methodologies.
  • AWS Certified Solutions Architect Professional.
  • Databricks Certified Associate Developer for Apache Spark.
  • Microsoft Certified: Azure Data Engineer Associate.
  • GCP: Google Cloud Certified Professional.
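Several of the requirements above (creating tables, loading and aggregating data with Spark SQL) come down to writing SQL against managed tables. As a minimal, self-contained sketch, the same pattern is shown here with Python's built-in sqlite3 in place of a Spark/Databricks session; the table and column names are illustrative:

```python
import sqlite3

# Self-contained stand-in for a Spark SQL session: create a table, load rows,
# then aggregate per user, the kind of query a reporting pipeline materializes.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, event_type TEXT, amount REAL)")
rows = [
    ("u1", "purchase", 30.0),
    ("u1", "purchase", 20.0),
    ("u2", "refund", -5.0),
    ("u2", "purchase", 15.0),
]
conn.executemany("INSERT INTO events VALUES (?, ?, ?)", rows)

result = conn.execute(
    """
    SELECT user_id, COUNT(*) AS n_events, SUM(amount) AS total
    FROM events
    GROUP BY user_id
    ORDER BY user_id
    """
).fetchall()
print(result)  # [('u1', 2, 50.0), ('u2', 2, 10.0)]
```

On Databricks the equivalent would run through `spark.sql(...)` against Delta tables, with partitioning and bucketing declared in the table DDL.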

Read more
Cookie Analyix
Gajalakshmi K
Posted by Gajalakshmi K
Remote only
7 - 16 yrs
₹5L - ₹15L / yr
SQL
ASP.NET
skill iconC#
Web API
Entity Framework
+1 more

Job Title: Senior .NET Developer (Remote)



Experience: 7+ Years
Location: India (Remote)


Apply: jobs[at]cookieanalytix[dot]net



Company Overview:
We are a dynamic and rapidly growing tech company looking for a talented Senior .NET Developer to join our team. As a Senior .NET Developer, you will work with a team of skilled professionals to build high-quality, scalable applications and contribute to the development of innovative software solutions.


Key Responsibilities:


* Design, develop, and maintain high-performance applications using .NET technologies.

* Collaborate with cross-functional teams to define, design, and ship new features.

* Write clean, maintainable, and efficient code.

* Optimize and troubleshoot SQL queries for database performance.

* Work with large, complex data sets and ensure the integrity of the database.

* Participate in code reviews and contribute to best practices for software development.

* Continuously improve development processes and implement automation where possible.

* Ensure all applications meet quality standards and provide excellent user experiences.
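The "optimize and troubleshoot SQL queries" responsibility above often starts with reading the query plan. Here is a minimal sketch using Python's built-in sqlite3 so it is self-contained (a SQL Server engagement would use execution plans and server-specific indexes instead); the schema is illustrative:

```python
import sqlite3

# Compare the query plan for the same query before and after adding an index.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, float(i)) for i in range(1000)],
)

query = "SELECT SUM(total) FROM orders WHERE customer_id = 42"

# Without an index the engine must scan the whole table.
plan_before = " ".join(r[-1] for r in conn.execute("EXPLAIN QUERY PLAN " + query))

# A covering index on (customer_id, total) lets it seek instead of scan.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id, total)")
plan_after = " ".join(r[-1] for r in conn.execute("EXPLAIN QUERY PLAN " + query))

print(plan_before)  # e.g. "SCAN orders"
print(plan_after)   # e.g. "SEARCH orders USING COVERING INDEX idx_orders_customer ..."
```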


Required Skills and Experience:


* 7+ years of experience as a .NET Developer with a proven track record of designing and developing scalable applications.

* Strong proficiency in ASP.NET, C#, Web API, React/Angular, and Entity Framework.

* Solid experience with SQL Server, including database design, writing optimized queries, and performance tuning.

* Excellent problem-solving skills and the ability to debug and optimize complex systems.

* Strong communication skills with the ability to work independently and as part of a team in a remote setting.



Why Join Us?


* Work from the comfort of your own home in a fully remote environment.

* Opportunity to work on innovative and challenging projects.

* Collaborative and inclusive work culture.




Read more
DataToBiz Pvt. Ltd.

at DataToBiz Pvt. Ltd.

2 recruiters
Shrishti Sharma
Posted by Shrishti Sharma
Remote only
3 - 4 yrs
₹4L - ₹8L / yr
Tableau
Spotfire
Qlikview
PowerBI
Data Visualization
+4 more
  • Develop & Maintain Dashboards: Create interactive and visually compelling dashboards and reports in Tableau, ensuring they meet business requirements and provide actionable insights.
  • Data Analysis & Visualization: Design data models and build visualizations to summarize large sets of data, ensuring accuracy, consistency, and clarity in the reports.
  • SQL Querying: Write complex SQL queries to extract and transform data from different data sources (databases, APIs, etc.), ensuring optimal performance.
  • Data Cleansing: Clean, validate, and prepare data for analysis, ensuring data integrity and consistency.
  • Collaboration: Work closely with cross-functional teams, including business analysts, data engineers, and stakeholders, to gather requirements and deliver customized reporting solutions.
  • Troubleshooting & Support: Provide technical support and troubleshooting for Tableau reports, dashboards, and data integration issues.
  • Performance Optimization: Optimize Tableau workbooks, dashboards, and queries for better performance and scalability.
  • Best Practices: Ensure Tableau development follows best practices for data visualization, performance optimization, and user experience.
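The data-cleansing step described above can be sketched in a few lines of plain Python; the field names and validation rules here are illustrative, not from any particular client dataset:

```python
from datetime import date

# Raw rows as they might arrive from an upstream export: inconsistent casing,
# stray whitespace, string-typed numbers, and one malformed amount.
raw_rows = [
    {"region": " North ", "sales": "1200.50", "day": "2024-03-01"},
    {"region": "south",   "sales": "n/a",     "day": "2024-03-01"},  # bad amount
    {"region": "East",    "sales": "800",     "day": "2024-03-02"},
]

def clean(row):
    """Normalize one record, or return None if it cannot be trusted."""
    try:
        return {
            "region": row["region"].strip().title(),
            "sales": float(row["sales"]),
            "day": date.fromisoformat(row["day"]),
        }
    except (ValueError, KeyError):
        return None  # quarantine malformed records instead of loading them

cleaned = [c for c in (clean(r) for r in raw_rows) if c is not None]
print([r["region"] for r in cleaned])  # ['North', 'East']
```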
Read more
Global Tech startup

Agency job
via Recruit Square by Priyanka choudhary
Remote only
2 - 8 yrs
$5K - $9K / yr
skill iconPython
skill iconMongoDB
SQL
skill iconDjango
skill iconFlask
+3 more

We are a dynamic and innovative technology company dedicated to delivering cutting-edge solutions that empower businesses and individuals. As we continue to grow and expand our offerings, we are seeking a coding fanatic who is interested in working with and learning new technologies.

Position - Backend developer 

Job type - Freelance or on contract 

Location - Remote 


Roles and Responsibilities:

  • Plan, create, and test REST APIs for back-end services such as authentication and authorization.
  • Deploy and maintain backend systems on the cloud.
  • Research and develop solutions for real-life business problems.
  • Create and maintain our apps' essential business logic, providing correct data processing and flawless user experiences.
  • Design, implement, and manage databases, including schema design, query optimisation, and data integrity.
  • Participate in code reviews, provide constructive input, and ensure that code quality standards are met.
  • Keep abreast of industry developments and best practices to bring new solutions to our initiatives.
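As a rough illustration of the authentication/authorization APIs mentioned in the first responsibility, here is a minimal token issue/verify sketch in plain Python using only the standard library. A production service would more likely use Django/Flask plus an established scheme such as JWT; the secret handling and helper names here are assumptions for the sake of the example:

```python
import hashlib
import hmac
import secrets
from typing import Optional

# Per-process signing key; a real service would load this from secure config.
SECRET_KEY = secrets.token_bytes(32)

def issue_token(user_id: str) -> str:
    """Return 'user_id.signature', signed with an HMAC over the user id."""
    sig = hmac.new(SECRET_KEY, user_id.encode(), hashlib.sha256).hexdigest()
    return f"{user_id}.{sig}"

def verify_token(token: str) -> Optional[str]:
    """Return the user id if the signature checks out, else None."""
    try:
        user_id, sig = token.rsplit(".", 1)
    except ValueError:
        return None
    expected = hmac.new(SECRET_KEY, user_id.encode(), hashlib.sha256).hexdigest()
    # compare_digest avoids timing side channels when checking signatures.
    return user_id if hmac.compare_digest(sig, expected) else None

token = issue_token("alice")
print(verify_token(token))             # alice
print(verify_token(token + "tamper"))  # None
```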

Required skills and experience:

Must-have skills:

  • Bachelor's degree in computer programming, computer science, or a related field.
  • 3+ years of experience in backend development.
  • Proficient in Python, MongoDB, Postgres/SQL, Django, and Flask.
  • Knowledge of Nginx.
  • C/C++ with Cython for creating Python modules.
  • Knowledge of Redis.
  • Familiarity with AI provider APIs and prompt engineering.
  • Experience working with Linux-based instances on the cloud.
  • Strong problem-solving and verbal and written communication skills.
  • Ability to work independently or in a group.

Good-to-have skills:

  • Experience with Node.js and Java.
  • AWS and Google Cloud knowledge.


Read more
Cyahlo
Aditya Amberkar
Posted by Aditya Amberkar
Remote only
0 - 1 yrs
₹1L - ₹1.2L / yr
skill iconJava
skill iconSpring Boot
SQL
API
skill iconPostman
+1 more

🚨 Hiring Alert 🚨


We are hiring a Java Backend Intern for 2 months!


Skills Required:


1. Good understanding of Java 17, Spring, and any SQL database

2. Good understanding of designing low-level code from scratch

3. Experience in building database schemas and code architecture

4. Familiarity with design patterns and a willingness to write clean, readable, and well-documented code

5. Familiarity with tools like Postman, STS, or IntelliJ

6. Understanding of REST APIs and their role in application development

7. Good DSA and problem-solving skills


Roles & Responsibilities:

1. Assist in developing and maintaining web applications.

2. Learn to utilize open-source tools for integration.

3. Collaborate with team members to design and implement new features.

4. Contribute to optimizing application performance and resolving bugs.

5. Stay curious and keep learning new technologies relevant to Spring Boot and Spring Reactive.

6. Exposure to version control systems like Git.

7. Passion for learning and contributing to real-world projects.


Preferred Qualifications:

1. 0-2 years of experience.

2. A background in computer science/IT or a related field.


What You’ll Gain:

1. Stipend: ₹8k-₹10k per month, subject to your performance.

2. Hands-on experience in building production-grade applications.

Read more
This is for a technology company

Agency job
via TalentGPT Consulting Pvt Ltd by Bhumika Dixit
Remote only
6 - 10 yrs
₹10L - ₹20L / yr
Camunda
RESTful APIs
skill icon.NET
skill iconJava
+3 more

Roles and Responsibilities:

  • 6+ years of IT experience with 3+ years in Camunda development; Camunda certification required.
  • Expertise in designing, developing, and implementing Camunda components like Job Workers and Process Models, following best practices.
  • Proficient in integrating with external systems using REST APIs, connectors, and web services; experienced in building REST services with Spring Boot or .NET.
  • Hands-on experience integrating Camunda with front-end systems, streaming products, Postgres, SMTP, SAP, and RPA systems; strong SQL query and function-writing skills.
  • Experienced in deploying solutions via Bitbucket and Git, maintaining documentation, and participating in code reviews to ensure quality and compliance.
  • Skilled in tracking and resolving CRs/defects through JIRA and providing technical support for UAT/PROD environments.


Read more
Deltek
Damodharan J
Posted by Damodharan J
Remote only
8 - 15 yrs
Best in industry
skill icon.NET
ASP.NET
skill iconJavascript
Powershell
RESTful APIs
+3 more

Qualifications

  • Bachelor's Degree or equivalent experience is required
  • A minimum of 8-10 years of experience working in an IT environment
  • At least 6-8 years of experience as a developer.
  • Experience leading complex projects/tasks
  • Strong hands-on experience with a variety of code, interface, API, and database concepts
  • Experience in full project life cycle development for systems and applications
  • Experience integrating and customizing business applications
  • Excellent oral and written communication skills, with the ability to collaborate effectively with team members at all levels, from junior developers to senior managers and business customers
  • A strong desire to learn the latest technologies and the ability to acquire knowledge quickly
  • Exceptional problem-solving and debugging skills


Technical Knowledge & Skills

  • Proficiency in programming languages such as JavaScript, .NET, C#, Java, PHP, PowerShell, and Regex
  • Experience with ERP, HRIS, and CRM and Case Management systems
  • Experience with ServiceNow (SNOW) glide extensions and JavaScript
  • Experience with Microsoft SharePoint
  • Proficiency in web services, REST APIs, string manipulation, SQL, and creating stored procedures using XML/JSON and SOAP/REST protocols
  • Experience with integrations using tools such as Power Automate, UiPath, MuleSoft, and IBM App Connect
  • Possess knowledge of DBA tasks, including deployments, maintenance planning, string manipulation, and authoring SQL statements
  • Experience using Source Code and Version Control Systems like Git
  • Experience in cloud-based deployment of applications
  • Windows OS


Read more
Cookie Analyix
Gajalakshmi K
Posted by Gajalakshmi K
Remote only
7 - 13 yrs
₹5L - ₹17L / yr
skill icon.NET
SQL

Sr. .NET Developer


Remote, India



What We’re Looking For:

7+ years of experience in .NET and SQL

Strong problem-solving skills and attention to detail.

Ability to work collaboratively in a team and remotely.


Why Join Us?

Flexible work-from-home options.

Competitive salary based on experience and skills.

Opportunity to work on innovative projects with the latest technologies.

A culture that values growth, learning, and collaboration.



Read more
Experience.com
Remote only
7 - 12 yrs
₹20L - ₹35L / yr
Google Cloud Platform (GCP)
Big Data
skill iconPython
SQL
pandas
+3 more

Description


Come Join Us


Experience.com - We make every experience matter more

Position: Senior GCP Data Engineer

Job Location: Chennai (Base Location) / Remote

Employment Type: Full Time


Summary of Position

A Senior Data Engineer is a professional who specializes in preparing big data infrastructure for analytical or operational uses. He/she is responsible for developing and maintaining scalable data pipelines and building out new API integrations to support continuing increases in data volume and complexity. They collaborate with data scientists and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision-making across the organisation.


Responsibilities:

  • Collaborate with cross-functional teams to define, prioritize, and execute data engineering initiatives aligned with business objectives.
  • Design and implement scalable, reliable, and secure data solutions by industry best practices and compliance requirements.
  • Drive the adoption of cloud-native technologies and architectural patterns to optimize the performance, cost, and reliability of data pipelines and analytics solutions.
  • Mentor and lead a team of Data Engineers.
  • Demonstrate a drive to learn and master new technologies and techniques.
  • Apply strong problem-solving skills with an emphasis on building data-driven or AI-enhanced products.
  • Coordinate with ML/AI and engineering teams to understand data requirements.


Experience & Skills:

  • 8+ years of strong experience in ETL and ELT of data from various sources into data warehouses
  • 8+ years of experience with Python, Pandas, NumPy, and SciPy
  • 5+ years of experience with GCP
  • 5+ years of experience with BigQuery, PySpark, and Pub/Sub
  • 5+ years of experience working with and creating data architectures
  • Certified as a Google Cloud Professional Data Engineer
  • Advanced proficiency in Google Cloud services such as Dataflow, Dataproc, Dataprep, Data Studio, and Cloud Composer
  • Proficient in writing complex Spark (PySpark) User Defined Functions (UDFs), Spark SQL, and HiveQL
  • Good understanding of Elasticsearch
  • Experience in assessing and ensuring data quality, data testing, and addressing data quality issues
  • Excellent understanding of Spark architecture and underlying frameworks, including storage management
  • Solid background in database design and development, database administration, and software engineering across full life cycles
  • Experience with NoSQL data stores such as MongoDB, DocumentDB, and DynamoDB
  • Knowledge of data governance principles and practices, including data lineage, metadata management, and access control mechanisms
  • Experience in implementing and optimizing data security controls, encryption, and compliance measures in GCP environments
  • Ability to troubleshoot complex issues, perform root cause analysis, and implement effective solutions in a timely manner
  • Proficiency in data visualization tools such as Tableau, Looker, or Data Studio to create insightful dashboards and reports for business users
  • Strong communication and interpersonal skills to effectively collaborate with technical and non-technical stakeholders, articulate complex concepts, and drive consensus
  • Experience with agile methodologies and project management tools like Jira or Asana for sprint planning, backlog grooming, and task tracking
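The data-quality assessment skills listed above can be sketched with a simple per-column completeness check. In a real engagement this logic would run inside a PySpark or Dataflow job; the threshold and column names below are illustrative:

```python
# A batch of records with some missing values, as a stand-in for a real table.
records = [
    {"id": 1, "email": "a@x.com", "country": "IN"},
    {"id": 2, "email": None,      "country": "US"},
    {"id": 3, "email": "c@x.com", "country": None},
    {"id": 4, "email": None,      "country": "IN"},
]

def completeness(rows, column):
    """Fraction of rows where the column is present and non-null."""
    filled = sum(1 for r in rows if r.get(column) is not None)
    return filled / len(rows)

# Flag any column whose completeness falls below an (illustrative) threshold.
THRESHOLD = 0.75
report = {col: completeness(records, col) for col in ("id", "email", "country")}
failing = sorted(col for col, rate in report.items() if rate < THRESHOLD)
print(report)   # {'id': 1.0, 'email': 0.5, 'country': 0.75}
print(failing)  # ['email']
```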


Read more
sagarsoft (india) ltd
Snehalatha Makam
Posted by Snehalatha Makam
Remote only
7 - 12 yrs
₹20L - ₹30L / yr
skill iconJava
J2EE
skill iconSpring Boot
Hibernate (Java)
Microservices
+1 more

Required Skills and Experience:

Proficient in Java (Java 8 and above), with a strong understanding of object-oriented programming.

Knowledge in the trading domain, including familiarity with trading systems and protocols.

Strong skills in SQL and PL/SQL for database management and query optimization.

Hands-on experience with Linux and Windows operating systems for application deployment and maintenance.

Proficiency in scripting languages (e.g., Bash, PowerShell, or similar).

Knowledge of Python programming for auxiliary development and analytics tasks.

Familiarity with multithreading, concurrency, and low-latency application development.

Experience with CI/CD pipelines, version control systems (e.g., Git), and deployment workflows.

Read more
Data Caliper
Sweety Silvester
Posted by Sweety Silvester
Remote, Coimbatore, Chennai, Bengaluru (Bangalore)
10 - 15 yrs
₹10L - ₹15L / yr
Selenium
Cucumber
Manual testing
Software Testing (QA)
Automation
+3 more

We are currently seeking a skilled and innovative QA Automation Lead to join our dynamic team. As a QA Lead, you will be responsible for automation test planning and product test strategy, and for creating automation test scripts to verify and validate product quality.


Join DataCaliper and step into the vanguard of technological advancement, where your proficiency will shape the landscape of data management and drive businesses toward unparalleled success.


Please find below our job description, if interested apply / reply sharing your profile to connect and discuss.


Company: Data Caliper

URL: https://datacaliper.com/

Work location: Coimbatore (Remote)

Experience: 10+ Years

Joining time: Immediate – 4 weeks

Required skills:

  • Good experience with Selenium, Cucumber, or other automation tools.
  • Good experience in Selenium-based automation of web, mobile, and desktop applications.
  • Very good written and oral business communication and presentation skills.
  • Basic SQL knowledge.
  • Experience with one test management tool and one defect management tool.
  • Awareness of STLC and testing processes.
  • Good attitude and communication skills.
  • Willingness to learn and stretch during the ramp-up period, as he/she should be hands-on very quickly.
  • Hands-on experience in Agile projects.


Thank you


Read more
Get to hear about interesting companies hiring right now
Linkedin iconFollow Cutshort
Why apply via Cutshort?
Connect with actual hiring teams and get their fast response. No spam.
Find more jobs