
50+ Remote SQL Jobs in India

Apply to 50+ Remote SQL Jobs on CutShort.io. Find your next job, effortlessly. Browse SQL Jobs and apply today!

Verix
Posted by Eman Khan
Remote only
6 - 10 yrs
₹15L - ₹34L / yr
Large Language Models (LLM)
SEO analytics
Ads analytics
Search
SQL
+8 more

What is OptimizeGEO?

OptimizeGEO is Verix’s flagship platform that helps brands stay visible, cited, and trusted in AI-powered answers. Unlike traditional SEO that optimizes for keywords and rankings, OptimizeGEO operationalizes AEO/GEO principles so brands are discoverable across generative systems (ChatGPT, Gemini, Claude, Perplexity) and answer engines (featured snippets, voice assistants, AI answer boxes).


Role Overview

We are building the next generation of measurement systems for Generative Engine Optimization (AEO/GEO)—defining how “quality,” “trust,” and “impact” are quantified in an AI-first discovery landscape. As Measurement Lead, you will architect the measurement strategy, frameworks, and data models that inform product development, customer outcomes, and go-to-market effectiveness. You’ll combine experimentation, analytics, and generative AI evaluation to create a durable, decision-grade measurement stack.


Key Responsibilities

  • Define Measurement Frameworks: Operationalize KPIs for AEO performance, including model visibility/coverage, ranking/recall, trust & attribution signals, and downstream engagement/ROI (a minimal sketch follows this list).
  • Own the Measurement Stack: Partner with Data Engineering to build systems for A/B and multivariate testing, offline/online model evaluation, and longitudinal tracking of AEO metrics.
  • Model & Content Evaluation: Establish benchmarks and scoring systems for generative output quality, factuality, and attribution, leveraging both human-in-the-loop and automated evaluation.
  • Cross-Functional Alignment: Drive shared definitions and measurement standards across Product, Data Science, Customer Success, and GTM.
  • Insight to Action: Translate raw data into clear recommendations that improve product performance and customer ROI; create exec-ready narratives that tie measurement to business outcomes.
  • Thought Leadership: Be the internal SME on measurement in the generative era; evangelize best practices and influence roadmaps through storytelling with data.
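
For illustration only (not part of the role description): a minimal Python sketch of how a basic visibility/coverage KPI of the kind named in the first responsibility could be computed from collected generative-engine answers. The Answer structure, engine labels, brand aliases, and domains are hypothetical placeholders, not Verix's actual measurement stack.

    from dataclasses import dataclass, field

    @dataclass
    class Answer:
        engine: str                      # e.g. "chatgpt", "perplexity" (hypothetical labels)
        query: str                       # the prompt that was issued
        text: str                        # the generated answer
        cited_domains: list = field(default_factory=list)   # domains the engine cited, if any

    def visibility_metrics(answers, brand_aliases, brand_domains):
        """Share of answers that mention the brand, and share that cite its domains."""
        mentioned = cited = 0
        for a in answers:
            text = a.text.lower()
            if any(alias.lower() in text for alias in brand_aliases):
                mentioned += 1
            if any(d in a.cited_domains for d in brand_domains):
                cited += 1
        n = len(answers) or 1
        return {"mention_rate": mentioned / n, "citation_rate": cited / n}

    # Hypothetical usage with a single collected answer:
    sample = [Answer("chatgpt", "best crm for smb", "Acme CRM is a popular choice...", ["acme.example"])]
    print(visibility_metrics(sample, ["Acme CRM", "Acme"], ["acme.example"]))

In practice such rates would be tracked per engine and per query cohort over time, which is what makes them longitudinal AEO metrics rather than one-off counts.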


Qualifications (Minimum)

  • 6–10 years in analytics, data science, experimentation, or measurement - ideally in search, ads, LLM evaluation, or content optimization.
  • Proven experience designing metric frameworks and experimentation systems for complex or multi-sided products.
  • Deep understanding of AI/LLM evaluation and/or SEO/ads analytics; familiarity with offline vs. online metrics and counterfactuals.
  • Advanced proficiency in SQL and Python/R; hands-on with tools such as Amplitude, Mixpanel, Looker/Looker Studio, dbt, BigQuery/Snowflake, and experiment platforms.
  • Demonstrated ability to connect analytical rigor to strategic decisions; strong communication and stakeholder influence skills.


Preferred Experience

  • Background in search quality, ads measurement, or model eval (e.g., BLEU, ROUGE, BERTScore, factuality/trustworthiness).
  • Experience with human evaluation ops, prompt and data set design, and rubric development for LLMs.
  • Prior experience in startup or new product incubation environments.


What Success Looks Like (Outcomes)

  • Launch the industry’s first AEO Quality Score and reference measurement model.
  • Deliver visibility frameworks that tie AI discoverability → content optimization → commercial ROI.
  • Establish a robust experimentation and evaluation pipeline that accelerates product velocity and elevates customer outcomes.
  • Be recognized as the go-to expert on generative measurement internally and externally.


Ways of Working / Tooling (indicative)

SQL, Python/R, Experiment platforms, Amplitude/Mixpanel, Looker/Looker Studio, BigQuery/Snowflake, dbt, Airflow, prompt-evaluation tooling, annotation platforms, dashboards for exec reporting.


Equal Opportunity

Virtualness is an equal opportunity employer. We celebrate diversity and are committed to an inclusive environment for all employees.

Grey Chain Technology
Posted by Pratikshya Pusty
Remote only
3 - 6 yrs
₹6L - ₹8L / yr
Functional testing
SQL
API

Job Description

3-5 years of hands-on experience in manual testing involving functional, non-functional, regression, and integration testing in a structured environment.

Candidate should have exceptional communication skills.

Should have a minimum of 1 year of work experience in data comparison testing.

Experience in testing web-based applications.

Able to define the scope of testing.

Experience in testing large-scale solutions integrating multiple source and target systems.

Experience in API testing.

Experience in Database verification using SQL queries (a minimal sketch follows these requirements).

Experience working in an Agile team.

Should be able to attend Agile ceremonies in UK hours.

Having a good understanding of Data Migration projects will be a plus.
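
For illustration only: the API testing and database verification requirements above often combine into a data comparison check. Below is a minimal Python sketch of such a check; the endpoint, table, and columns are hypothetical, and sqlite3 merely stands in for whichever SQL database the application uses.

    import sqlite3            # stand-in for the application database; any SQL driver works similarly
    import requests           # assumes the requests package is available

    API_URL = "https://api.example.com/v1/customers/{id}"   # hypothetical endpoint
    DB_PATH = "app.db"                                       # hypothetical database

    def test_customer_record_matches_database(customer_id=42):
        """Data comparison check: the API payload should match the row stored in the database."""
        api_record = requests.get(API_URL.format(id=customer_id), timeout=10).json()

        with sqlite3.connect(DB_PATH) as conn:
            row = conn.execute(
                "SELECT name, email, status FROM customers WHERE id = ?", (customer_id,)
            ).fetchone()

        assert row is not None, "customer missing from database"
        db_record = dict(zip(("name", "email", "status"), row))
        for column, expected in db_record.items():
            assert api_record.get(column) == expected, f"mismatch in {column}"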

Quanteon Solutions
Posted by DurgaPrasad Sannamuri
Remote only
8 - 12 yrs
₹20L - ₹26L / yr
.NET
ASP.NET
C#
Angular (2+)
JavaScript
+7 more

We are seeking a highly skilled and experienced Senior Full Stack Developer with 8+ years of experience to join our dynamic team. The ideal candidate will have a strong background in both front-end and back-end development, with expertise in .NET, Angular, TypeScript, Azure, SQL Server, Agile methodologies, and Design Patterns. Experience with DocuSign is a plus.


Responsibilities:

  • Design, develop, and maintain web applications using .NET, Angular, and TypeScript.
  • Collaborate with cross-functional teams to define, design, and ship new features.
  • Implement and maintain cloud-based solutions using Azure.
  • Develop and optimize SQL Server databases.
  • Follow Agile methodologies to manage project tasks and deliverables.
  • Apply design patterns and best practices to ensure high-quality, maintainable code.
  • Troubleshoot and resolve software defects and issues.
  • Mentor and guide junior developers.

Requirements:

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • Proven experience as a Full Stack Developer or similar role.
  • Strong proficiency in .NET, Angular, and TypeScript.
  • Experience with Azure cloud services.
  • Proficient in SQL Server and database design.
  • Familiarity with Agile methodologies and practices.
  • Solid understanding of design patterns and software architecture principles.
  • Excellent problem-solving skills and attention to detail.
  • Strong communication and teamwork abilities.
  • Experience with DocuSign is a plus.


Remote only
10 - 15 yrs
₹25L - ₹40L / yr
Data Engineer
Apache Spark
Scala
Big Data
Python
+5 more

What You’ll Be Doing:

● Own the architecture and roadmap for scalable, secure, and high-quality data pipelines and platforms.

● Lead and mentor a team of data engineers while establishing engineering best practices, coding standards, and governance models.

● Design and implement high-performance ETL/ELT pipelines using modern Big Data technologies for diverse internal and external data sources.

● Drive modernization initiatives including re-architecting legacy systems to support next-generation data products, ML workloads, and analytics use cases.

● Partner with Product, Engineering, and Business teams to translate requirements into robust technical solutions that align with organizational priorities.

● Champion data quality, monitoring, metadata management, and observability across the ecosystem.

● Lead initiatives to improve cost efficiency, data delivery SLAs, automation, and infrastructure scalability.

● Provide technical leadership on data modeling, orchestration, CI/CD for data workflows, and cloud-based architecture improvements.


Qualifications:

● Bachelor's degree in Engineering, Computer Science, or relevant field.

● 8+ years of relevant and recent experience in a Data Engineer role.

● 5+ years recent experience with Apache Spark and solid understanding of the fundamentals.

● Deep understanding of Big Data concepts and distributed systems.

● Demonstrated ability to design, review, and optimize scalable data architectures across ingestion.

● Strong coding skills with Scala and Python, and the ability to quickly switch between them with ease.

● Advanced working SQL knowledge and experience working with a variety of relational databases such as Postgres and/or MySQL.

● Cloud experience with Databricks.

● Strong understanding of Delta Lake architecture and working with Parquet, JSON, CSV, and similar formats.

● Experience establishing and enforcing data engineering best practices, including CI/CD for data, orchestration and automation, and metadata management.

● Comfortable working in an Agile environment.

● Machine Learning knowledge is a plus.

● Demonstrated ability to operate independently, take ownership of deliverables, and lead technical decisions.

● Excellent written and verbal communication skills in English.

● Experience supporting and working with cross-functional teams in a dynamic environment.

REPORTING: This position will report to Sr. Technical Manager or Director of Engineering as assigned by Management.

EMPLOYMENT TYPE: Full-Time, Permanent


SHIFT TIMINGS: 10:00 AM - 07:00 PM IST

Remote only
6 - 15 yrs
₹25L - ₹45L / yr
Data engineering
Data Engineer
Scala
PySpark
Apache Spark
+14 more

Shift: 2:00 PM – 11:00 PM IST

Experience: 6+ years of hands-on Data Engineering experience


About the Role:

We are looking for experienced Data Engineers who can design, build, and optimize large-scale data pipelines. This role is for individual contributors who love coding, problem-solving, and working with cutting-edge big data technologies.


Key Responsibilities:

  • Design, develop, and maintain scalable data pipelines for ETL processes using Spark (PySpark/Scala); a minimal PySpark sketch follows this list.
  • Optimize data flow and automate existing processes for improved performance.
  • Collaborate with cross-functional teams — Data Science, Analytics, and Product — to support data infrastructure needs.
  • Work on data quality, reliability, and performance improvements.
  • Handle large datasets from multiple sources (structured & unstructured).
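
For illustration only, a minimal PySpark sketch of the kind of ETL pipeline described in the first responsibility above; the paths, columns, and aggregation logic are hypothetical placeholders.

    from pyspark.sql import SparkSession, functions as F

    RAW_PATH = "s3://raw-bucket/orders/*.json"        # hypothetical semi-structured source
    OUT_PATH = "s3://curated-bucket/orders_daily"     # hypothetical curated target

    spark = SparkSession.builder.appName("orders-etl").getOrCreate()

    orders = spark.read.json(RAW_PATH)                # extract

    daily = (                                         # transform: filter, derive, aggregate
        orders
        .filter(F.col("status") == "COMPLETED")
        .withColumn("order_date", F.to_date("created_at"))
        .groupBy("order_date", "country")
        .agg(F.count("*").alias("orders"), F.sum("amount").alias("revenue"))
    )

    daily.write.mode("overwrite").partitionBy("order_date").parquet(OUT_PATH)   # load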


Required Skills & Experience:

  • 6+ years of experience as a Data Engineer working in big data environments.
  • Strong programming experience in Scala and/or PySpark.
  • Hands-on experience with Apache Spark, Databricks, and cloud-based data architectures.
  • Solid knowledge of SQL and relational databases (Postgres, MySQL, etc.).
  • Experience with file formats like Parquet, Delta Tables, CSV, JSON.
  • Comfortable working in a Linux shell environment.
  • Strong communication and problem-solving skills.


Good to Have:

  • Experience in machine learning data pipelines.
  • Working knowledge of Agile environments and CI/CD data workflows.


Deltek
Remote only
4 - 7 yrs
Best in industry
.NET
C#
SQL
Artificial Intelligence (AI)
Web Development
+3 more


Sr Software Engineer

Company Summary :

As the recognized global standard for project-based businesses, Deltek delivers software and information solutions to help organizations achieve their purpose. Our market leadership stems from the work of our diverse employees who are united by a passion for learning, growing and making a difference.

At Deltek, we take immense pride in creating a balanced, values-driven environment, where every employee feels included and empowered to do their best work. Our employees put our core values into action daily, creating a one-of-a-kind culture that has been recognized globally. Thanks to our incredible team, Deltek has been named one of America's Best Midsize Employers by Forbes, a Best Place to Work by Glassdoor, a Top Workplace by The Washington Post and a Best Place to Work in Asia by World HRD Congress. www.deltek.com

Business Summary :

The Deltek Engineering and Technology team builds best-in-class solutions to delight customers and meet their business needs. We are laser-focused on software design, development, innovation and quality. Our team of experts has the talent, skills and values to deliver products and services that are easy to use, reliable, sustainable and competitive. If you're looking for a safe environment where ideas are welcome, growth is supported and questions are encouraged – consider joining us as we explore the limitless opportunities of the software industry.

Position Responsibilities :

About the Role

We are looking for a skilled and motivated Senior Software Developer to join our team responsible for developing and maintaining a robust ERP solution used by approximately 400 customers and more than 30000 users worldwide. The system is built using C# (.NET Core), leverages SQL Server for data management, and is hosted in the Microsoft Azure cloud.

This role offers the opportunity to work on a mission-critical product, contribute to architectural decisions, and help shape the future of our cloud-native ERP platform.

Key Responsibilities

  • Design, develop, and maintain features and modules within the ERP system using C# (.NET Core)
  • Optimize and manage SQL Server database interactions for performance and scalability
  • Collaborate with cross-functional teams, including QA, DevOps, Product Management, and Support
  • Participate in code reviews, architecture discussions, and technical planning
  • Contribute to the adoption and improvement of CI/CD pipelines and cloud deployment practices
  • Troubleshoot and resolve complex technical issues across the stack
  • Ensure code quality, maintainability, and adherence to best practices
  • Stay current with emerging technologies and recommend improvements where applicable

Qualifications

  • Curiosity, passion, teamwork, and initiative
  • Strong experience with C# and .NET Core in enterprise application development
  • Solid understanding of SQL Server, including query optimization and schema design
  • Experience with Azure cloud services (App Services, Azure SQL, Storage, etc.)
  • Ability to utilize agentic AI as a development aid, with a critical-thinking mindset
  • Familiarity with agile development methodologies and DevOps practices
  • Ability to work independently and collaboratively in a fast-paced environment
  • Excellent problem-solving and communication skills
  • Master's degree in Computer Science or equivalent; 5+ years of relevant work experience
  • Experience with ERP systems or other complex business applications is a plus

What We Offer

  • A chance to work on a product that directly impacts thousands of users worldwide
  • A collaborative and supportive engineering culture
  • Opportunities for professional growth and technical leadership
  • Competitive salary and benefits package

Remote only
1 - 3 yrs
₹3L - ₹5L / yr
HTML/CSS
JavaScript
Bootstrap
PHP
CodeIgniter
+1 more

Position: Full Stack Developer (PHP CodeIgniter)

Company: Mayura Consultancy Services

Experience: 2 yrs

Location: Bangalore

Skills: HTML, CSS, Bootstrap, JavaScript, AJAX, jQuery, PHP, and CodeIgniter (CI)

Work Location: Work From Home (WFH)

Website: https://www.mayuraconsultancy.com/


Requirements :

  • Prior experience in Full Stack Development using PHP Codeigniter


Perks of Working with MCS :

  • Contribute to Innovative Solutions: Join a dynamic team at the forefront of software development, contributing to innovative projects and shaping the technological solutions of the organization.
  • Work with Clients from across the Globe: Collaborate with clients from around the world, gaining exposure to diverse cultures and industries, and contributing to the development of solutions that address the unique needs and challenges of global businesses.
  • Complete Work From Home Opportunity: Enjoy the flexibility of working entirely from the comfort of your home, empowering you to manage your schedule and achieve a better work-life balance while coding innovative solutions for MCS.
  • Opportunity to Work on Projects Developing from Scratch: Engage in projects from inception to completion, working on solutions developed from scratch and having the opportunity to make a significant impact on the design, architecture, and functionality of the final product.
  • Diverse Projects: Be involved in a variety of development projects, including web applications, mobile apps, e-commerce platforms, and more, allowing you to showcase your versatility as a Full Stack Developer and expand your portfolio.


Joining MCS as a Full Stack Developer opens the door to a world where your technical skills can shine and grow, all while enjoying a supportive and dynamic work environment. We're not just building solutions; we're building the future—and you can be a key part of that journey.


NeoGenCode Technologies Pvt Ltd
Posted by Ritika Verma
Remote only
4 - 7 yrs
₹8L - ₹11L / yr
PHP
Laravel
SQL
MySQL
Object Oriented Programming (OOPs)

Job Title: PHP Coordinator / Laravel Developer

Experience: 4+ Years

Work Mode: Work From Home (WFH)

Working Days: 5 Days


Job Description:

We are looking for an experienced PHP Coordinator / Laravel Developer to join our team. The ideal candidate should have strong expertise in PHP and Laravel framework, along with the ability to coordinate and manage development tasks effectively.

Key Responsibilities:

  • Develop, test, and maintain web applications using PHP and Laravel.
  • Coordinate with team members to ensure timely project delivery.
  • Write clean, secure, and efficient code.
  • Troubleshoot, debug, and optimize existing applications.
  • Collaborate with stakeholders to gather and analyze requirements.

Required Skills:

  • Strong experience in PHP and Laravel framework.
  • Good understanding of MySQL and RESTful APIs.
  • Familiarity with front-end technologies (HTML, CSS, JavaScript).
  • Excellent communication and coordination skills.
  • Ability to work independently in a remote environment.



Intineri infosol Pvt Ltd
Posted by Adil Saifi
Remote only
8 - 12 yrs
₹3L - ₹15L / yr
ODI
ETL
Oracle
ELT
Oracle Data Integrator
+2 more

Shift: 9 PM IST to 6 AM IST

Experience: 8+ Years


Requirements


We are seeking a skilled and experienced Data Integration Specialist with over 5 years of experience in designing and developing data solutions using Oracle Data Integrator (ODI). The ideal candidate will have strong expertise in data modeling, ETL/ELT processes, and SQL, along with exposure to Python scripting for API-based data ingestion.


Key Responsibilities:


Design, develop, and maintain functions and stored procedures using Oracle Data Integrator (ODI).

Create and document data warehouse schemas, including fact and dimension tables, based on business requirements.

Develop and execute SQL scripts for table creation and collaborate with Database Administrators (DBAs) for deployment.

Analyze various data sources to identify relationships and align them with Business Requirements Documentation (BRD).

Design and implement Extract, Load, Transform (ELT) processes to load data from source systems into staging and target environments.

Validate and profile data using Structured Query Language (SQL) and other analytical tools to ensure data accuracy and completeness.

Apply best practices in data governance, including query optimization, metadata management, and data quality monitoring.

Demonstrate strong data modeling skills to support scalable and efficient data architecture.

Utilize Python to automate data collection from APIs, enhancing integration workflows and enabling real-time data ingestion (a minimal sketch follows these responsibilities).

Investigate and resolve data quality issues through detailed analysis and root cause identification.

Communicate effectively with stakeholders through strong written, verbal, and analytical skills.

Exhibit excellent problem-solving and research capabilities in a fast-paced, data-driven environment.
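
Purely as an illustration of the Python-based API ingestion responsibility above: a minimal sketch in which the endpoint, columns, and staging table are hypothetical, and sqlite3 stands in for the Oracle staging schema that an ODI mapping would normally load from.

    import sqlite3            # stand-in for the staging database; an Oracle driver would be used in practice
    import requests

    API_URL = "https://api.example.com/v1/rates"      # hypothetical source API
    DB_PATH = "staging.db"                             # hypothetical staging database

    def ingest_rates():
        """Pull records from a REST API and land them in a staging table for downstream ELT."""
        rows = requests.get(API_URL, timeout=30).json()                      # extract
        with sqlite3.connect(DB_PATH) as conn:
            conn.execute(
                "CREATE TABLE IF NOT EXISTS stg_rates (currency TEXT, rate REAL, as_of TEXT)"
            )
            conn.executemany(                                                # load into staging
                "INSERT INTO stg_rates (currency, rate, as_of) VALUES (?, ?, ?)",
                [(r["currency"], r["rate"], r["as_of"]) for r in rows],
            )

    if __name__ == "__main__":
        ingest_rates()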

Zonecheck
Posted by Niranjan G
Remote, Prabhadevi
1 - 2 yrs
₹2.5L - ₹3L / yr
React Native
React.js
JavaScript
SQL

Tech Stack / Requirements:

  1. Experience required: at least 1 - 2 yrs
  2. Candidates must be from an IT Engineering background (B.E./B.Tech in Information Technology, Computer Science, or related fields), B.Sc. IT, BCA or related fields.
  3. Strong understanding of JavaScript
  4. Experience with React Native / Expo
  5. Familiarity with SQL
  6. Exposure to REST APIs integration
  7. Fast learner with strong problem-solving & debugging skills


Responsibilities:

  1. Build & improve mobile app features using React Native / Expo
  2. Develop and maintain web features using React.js / Next.js
  3. Integrate APIs and ensure seamless user experiences across platforms
  4. Collaborate with backend & design teams for end-to-end development
  5. Debug & optimize performance across mobile and web
  6. Write clean, maintainable code and ship to production regularly


Work closely with the founding team / CTO and contribute to product launches


Growth: Performance-based growth with significant hikes possible in the same or upcoming months.

Remote only
6 - 10 yrs
₹8L - ₹15L / yr
Informatica IICS/IDMC
Informatica PowerCenter
ETL
SQL
Data migration
+1 more

Job Title : Informatica Cloud Developer / Migration Specialist

Experience : 6 to 10 Years

Location : Remote

Notice Period : Immediate


Job Summary :

We are looking for an experienced Informatica Cloud Developer with strong expertise in Informatica IDMC/IICS and experience in migrating from PowerCenter to Cloud.

The candidate will be responsible for designing, developing, and maintaining ETL workflows, data warehouses, and performing data integration across multiple systems.


Mandatory Skills :

Informatica IICS/IDMC, Informatica PowerCenter, ETL Development, SQL, Data Migration (PowerCenter to IICS), and Performance Tuning.


Key Responsibilities :

  • Design, develop, and maintain ETL processes using Informatica IICS/IDMC.
  • Work on migration projects from Informatica PowerCenter to IICS Cloud.
  • Troubleshoot and resolve issues related to mappings, mapping tasks, and taskflows.
  • Analyze business requirements and translate them into technical specifications.
  • Conduct unit testing, performance tuning, and ensure data quality.
  • Collaborate with cross-functional teams for data integration and reporting needs.
  • Prepare and maintain technical documentation.

Required Skills :

  • 4 to 5 years of hands-on experience in Informatica Cloud (IICS/IDMC).
  • Strong experience with Informatica PowerCenter.
  • Proficiency in SQL and data warehouse concepts.
  • Good understanding of ETL performance tuning and debugging.
  • Excellent communication and problem-solving skills.
KDK Softwares
Posted by Priyanka Khandelwal
Remote, Jaipur
5 - 10 yrs
₹5L - ₹12L / yr
SQL
JavaScript
Amazon Web Services (AWS)
SQL Azure
HTTP
+1 more

Role & responsibilities


  • Develop and maintain server-side applications using Go Lang.
  • Design and implement scalable, secure, and maintainable RESTful APIs and microservices.
  • Collaborate with front-end developers to integrate user-facing elements with server-side logic
  • Optimize applications for performance, reliability, and scalability.
  • Write clean, efficient, and reusable code that adheres to best practices.


Preferred candidate profile


  • Minimum 5 years of working experience in Go Lang development.
  • Proven experience in developing RESTful APIs and microservices.
  • Familiarity with cloud platforms like AWS, GCP, or Azure.
  • Familiarity with CI/CD pipelines and DevOps practices.


Sun King
Posted by Reshika Mendiratta
Remote only
3yrs+
Best in industry
Java
Spring Boot
Microservices
SQL
SOAP
+7 more

About the role:

The SDE 2 - Backend will work as part of the Digitization and Automation team to help Sun King design, develop, and implement intelligent, tech-enabled solutions that solve a large variety of our business problems. We are looking for candidates with an affinity for technology and automation, curiosity about advancements in products, and strong coding skills for our in-house software development team.


What you will be expected to do:

  • Design and build applications/systems based on wireframes and product requirements documents
  • Design and develop conceptual and physical data models to meet application requirements. 
  • Identify and correct bottlenecks/bugs according to operational requirements
  • Focus on scalability, performance, service robustness, and cost trade-offs.
  • Create prototypes and proof-of-concepts for iterative development.
  • Take complete ownership of projects (end to end) and their development cycle
  • Mentoring and guiding team members
  • Unit test code for robustness, including edge cases, usability and general reliability
  • Integrate existing tools and business systems (in-house tools or business tools like ticketing software and communication tools) with external services
  • Coordinate with the Product Manager, development team & business analysts

You might be a strong candidate if you have/are:

  • Development experience: 3 – 5 years
  • Should be very strong in problem-solving, data structures, and algorithms.
  • Deep knowledge of OOPS concepts and programming skills in Core Java and Spring Boot Framework
  • Strong Experience in SQL
  • Experience in web service development and integration (SOAP, REST, JSON, XML)
  • Understanding of code versioning tools (e.g., git)
  • Experience in Agile/Scrum development process and tools
  • Experience in Microservice architecture
  • Hands-on experience in AWS RDS, EC2, S3 and deployments

Good to have:

  • Knowledge of messaging systems such as RabbitMQ and Kafka.
  • Knowledge of Python
  • Container-based application deployment (Docker or equivalent)
  • Willing to learn new technologies and implement them in products


What Sun King offers:

  • Professional growth in a dynamic, rapidly expanding, high-social-impact industry
  • An open-minded, collaborative culture made up of enthusiastic colleagues who are driven by the challenge of innovation towards profound impact on people and the planet.
  • A truly multicultural experience: you will have the chance to work with and learn from people from different geographies, nationalities, and backgrounds.
  • Structured, tailored learning and development programs that help you become a better leader, manager, and professional through the Sun King Center for Leadership.


About Sun King

Sun King is a leading off-grid solar energy company providing affordable, reliable electricity to 1.8 billion people without grid access. Operating across Africa and Asia, Sun King has connected over 20 million homes, adding 200,000 homes monthly.

Through a ‘pay-as-you-go’ model, customers make small daily payments (as low as $0.11) via mobile money or cash, eventually owning their solar equipment and saving on costly kerosene or diesel. To date, Sun King products have saved customers over $4 billion.

With 28,000 field agents and embedded electronics that regulate usage based on payments, Sun King ensures seamless energy access. Its products range from home lighting and phone charging systems to solar inverters capable of powering high-energy appliances.

Sun King is expanding into clean cooking, electric mobility, and entertainment while serving a wide range of income segments.

The company employs 2,800 staff across 12 countries, with women representing 44% of the workforce, and expertise spanning product design, data science, logistics, sales, software, and operations.

Remote only
10 - 17 yrs
₹20L - ₹30L / yr
Apache Spark
Big Data
Scala
Python
Databricks
+1 more

Position: Senior Data Engineer


Overview:

We are seeking an experienced Senior Data Engineer to design, build, and optimize scalable data pipelines and infrastructure to support cross-functional teams and next-generation data initiatives. The ideal candidate is a hands-on data expert with strong technical proficiency in Big Data technologies and a passion for developing efficient, reliable, and future-ready data systems.


Reporting: Reports to the CEO or designated Lead as assigned by management.

Employment Type: Full-time, Permanent

Location: Remote (Pan India)

Shift Timings: 2:00 PM – 11:00 PM IST


Key Responsibilities:

  • Design and develop scalable data pipeline architectures for data extraction, transformation, and loading (ETL) using modern Big Data frameworks.
  • Identify and implement process improvements such as automation, optimization, and infrastructure re-design for scalability and performance.
  • Collaborate closely with Engineering, Product, Data Science, and Design teams to resolve data-related challenges and meet infrastructure needs.
  • Partner with machine learning and analytics experts to enhance system accuracy, functionality, and innovation.
  • Maintain and extend robust data workflows and ensure consistent delivery across multiple products and systems.


Required Qualifications:

  • Bachelor’s degree in Computer Science, Engineering, or related field.
  • 10+ years of hands-on experience in Data Engineering.
  • 5+ years of recent experience with Apache Spark, with a strong grasp of distributed systems and Big Data fundamentals.
  • Proficiency in Scala, Python, Java, or similar languages, with the ability to work across multiple programming environments.
  • Strong SQL expertise and experience working with relational databases such as PostgreSQL or MySQL.
  • Proven experience with Databricks and cloud-based data ecosystems.
  • Familiarity with diverse data formats such as Delta Tables, Parquet, CSV, and JSON.
  • Skilled in Linux environments and shell scripting for automation and system tasks.
  • Experience working within Agile teams.
  • Knowledge of Machine Learning concepts is an added advantage.
  • Demonstrated ability to work independently and deliver efficient, stable, and reliable software solutions.
  • Excellent communication and collaboration skills in English.



About the Organization:

We are a leading B2B data and intelligence platform specializing in high-accuracy contact and company data to empower revenue teams. Our technology combines human verification and automation to ensure exceptional data quality and scalability, helping businesses make informed, data-driven decisions.


What We Offer:

Our workplace embraces diversity, inclusion, and continuous learning. With a fast-paced and evolving environment, we provide opportunities for growth through competitive benefits including:

  • Paid Holidays and Leaves
  • Performance Bonuses and Incentives
  • Comprehensive Medical Policy
  • Company-Sponsored Training Programs

We are an Equal Opportunity Employer, committed to maintaining a workplace free from discrimination and harassment. All employment decisions are made based on merit, competence, and business needs.

Automate Accounts
Posted by Namrata Das
Remote only
2 - 4 yrs
₹5L - ₹10L / yr
Zoho
Python
NodeJS (Node.js)
SQL
Docker
+1 more

Responsibilities

  • Develop and maintain web and backend components using Python, Node.js, and Zoho tools
  • Design and implement custom workflows and automations in Zoho
  • Perform code reviews to maintain quality standards and best practices
  • Debug and resolve technical issues promptly
  • Collaborate with teams to gather and analyze requirements for effective solutions
  • Write clean, maintainable, and well-documented code
  • Manage and optimize databases to support changing business needs
  • Contribute individually while mentoring and supporting team members
  • Adapt quickly to a fast-paced environment and meet expectations within the first month



Selection Process

  1. HR Screening: Review of qualifications and experience
  2. Online Technical Assessment: Test coding and problem-solving skills
  3. Technical Interview: Assess expertise in web development, Python, Node.js, APIs, and Zoho
  4. Leadership Evaluation: Evaluate team collaboration and leadership abilities
  5. Management Interview: Discuss cultural fit and career opportunities
  6. Offer Discussion: Finalize compensation and role specifics



Experience Required

  • 2-4 years of relevant experience as a Zoho Developer
  • Proven ability to work as a self-starter and contribute individually
  • Strong technical and interpersonal skills to support team members effectively



Netroadshow

Agency job
via MWIDM by Priyanka Maurya
Remote only
5 - 9 yrs
₹22L - ₹30L / yr
.NET
C#
Microservices
ASP.NET
Unit testing
+2 more

Required Skills: 

  • 4+ years of experience designing, developing, and implementing enterprise-level, n-tier software solutions.
  • Proficiency with Microsoft C# is a must.
  • In-depth experience with the .NET Framework and .NET Core.
  • Knowledge of OOP, server technologies, and SOA is a must; 3+ years of microservice experience.
  • Relevant experience with database design and SQL (Postgres is preferred).
  • Experience with ORM tooling.
  • Experience delivering software that is correct, stable, and security compliant.
  • Basic understanding of common cloud platforms (good to have).
  • Financial services experience is strongly preferred.  
  • Thorough understanding of XML/JSON and related technologies.  
  • Thorough understanding of unit, integration, and performance testing for APIs.   
  • Entrepreneurial spirit. You are self-directed, innovative, and biased towards action. You love to build new things and thrive in fast-paced environments.  
  • Excellent communication and interpersonal skills, with an emphasis on strong writing and analytical problem-solving. 



Nyx Wolves
Remote only
5 - 7 yrs
₹11L - ₹13L / yr
SQL
Data modeling
Web performance optimization
Data engineering

Now Hiring: Tableau Developer (Banking Domain) 🚀

We’re looking for a Tableau professional with 6+ years of experience to design and optimize dashboards for Banking & Financial Services.


🔹 Design & optimize interactive Tableau dashboards for large banking datasets

🔹 Translate KPIs into scalable reporting solutions

🔹 Ensure compliance with regulations like KYC, AML, Basel III, PCI-DSS

🔹 Collaborate with business analysts, data engineers, and banking experts

🔹 Bring deep knowledge of SQL, data modeling, and performance optimization


🌍 Location: Remote

📊 Domain Expertise: Banking / Financial Services


✨ Preferred experience with cloud data platforms (AWS, Azure, GCP) & certifications in Tableau are a big plus!


Bring your data visualization skills to transform banking intelligence & compliance reporting.


Remote only
0 - 0 yrs
₹3000 - ₹3500 / mo
SQL

About the Role

We are seeking motivated Data Engineering Interns to join our team remotely for a 3-month internship. This role is designed for students or recent graduates interested in working with data pipelines, ETL processes, and big data tools. You will gain practical experience in building scalable data solutions. While this is an unpaid internship, interns who successfully complete the program will receive a Completion Certificate and a Letter of Recommendation.

Responsibilities

  • Assist in designing and building data pipelines for structured and unstructured data.
  • Support ETL (Extract, Transform, Load) processes to prepare data for analytics (a minimal sketch follows this list).
  • Work with databases (SQL/NoSQL) for data storage and retrieval.
  • Help optimize data workflows for performance and scalability.
  • Collaborate with data scientists and analysts to ensure data quality and consistency.
  • Document workflows, schemas, and technical processes.
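
For interns new to the area, a minimal Python sketch of the extract-transform-load flow referenced above; the file name, columns, and SQLite target are hypothetical placeholders.

    import csv
    import sqlite3

    SOURCE_CSV = "sales_raw.csv"     # hypothetical source file
    DB_PATH = "warehouse.db"         # hypothetical target database

    def run_etl():
        """Tiny end-to-end ETL: extract a CSV, clean the rows, load them into SQLite."""
        with open(SOURCE_CSV, newline="") as f:                      # extract
            rows = list(csv.DictReader(f))

        cleaned = [                                                  # transform: drop bad rows, normalise types
            (r["order_id"], r["region"].strip().upper(), float(r["amount"]))
            for r in rows
            if r.get("amount")
        ]

        with sqlite3.connect(DB_PATH) as conn:                       # load
            conn.execute("CREATE TABLE IF NOT EXISTS sales (order_id TEXT, region TEXT, amount REAL)")
            conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", cleaned)

    if __name__ == "__main__":
        run_etl()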

Requirements

  • Strong interest in data engineering, databases, and big data systems.
  • Basic knowledge of SQL and relational database concepts.
  • Familiarity with Python, Java, or Scala for data processing.
  • Understanding of ETL concepts and data pipelines.
  • Exposure to cloud platforms (AWS, Azure, or GCP) is a plus.
  • Familiarity with big data frameworks (Hadoop, Spark, Kafka) is an advantage.
  • Good problem-solving skills and ability to work independently in a remote setup.

What You’ll Gain

  • Hands-on experience in data engineering and ETL pipelines.
  • Exposure to real-world data workflows.
  • Mentorship and guidance from experienced engineers.
  • Completion Certificate upon successful completion.
  • Letter of Recommendation based on performance.

Internship Details

  • Duration: 3 months
  • Location: Remote (Work from Home)
  • Stipend: Unpaid
  • Perks: Completion Certificate + Letter of Recommendation


Data Axle
Posted by Eman Khan
Remote, Pune
4 - 9 yrs
Best in industry
Machine Learning (ML)
Python
SQL
PySpark
XGBoost

About Data Axle:

Data Axle Inc. has been an industry leader in data, marketing solutions, sales, and research for over 50 years in the USA. Data Axle now has an established strategic global centre of excellence in Pune. This centre delivers mission-critical data services to its global customers, powered by its proprietary cloud-based technology platform and by leveraging proprietary business & consumer databases.


Data Axle Pune is pleased to have achieved certification as a Great Place to Work!


Roles & Responsibilities:

We are looking for a Data Scientist to join the Data Science Client Services team to continue our success of identifying high quality target audiences that generate profitable marketing return for our clients. We are looking for experienced data science, machine learning and MLOps practitioners to design, build and deploy impactful predictive marketing solutions that serve a wide range of verticals and clients. The right candidate will enjoy contributing to and learning from a highly talented team and working on a variety of projects.


We are looking for a Senior Data Scientist who will be responsible for:

  1. Ownership of design, implementation, and deployment of machine learning algorithms in a modern Python-based cloud architecture
  2. Design or enhance ML workflows for data ingestion, model design, model inference and scoring (a minimal sketch follows this list)
  3. Oversight on team project execution and delivery
  4. If senior, establish peer review guidelines for high quality coding to help develop junior team members’ skill set growth, cross-training, and team efficiencies
  5. Visualize and publish model performance results and insights to internal and external audiences
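
For illustration only: a minimal sketch of the model-design-and-scoring step referenced in item 2, using XGBoost on a synthetic stand-in dataset. The features, class balance, and hyperparameters are hypothetical and not Data Axle's production setup.

    from sklearn.datasets import make_classification
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split
    from xgboost import XGBClassifier   # assumes the xgboost package is installed

    # Synthetic, imbalanced stand-in for a response-propensity dataset.
    X, y = make_classification(n_samples=5000, n_features=20, weights=[0.9, 0.1], random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

    model = XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1, eval_metric="auc")
    model.fit(X_train, y_train)

    scores = model.predict_proba(X_test)[:, 1]                       # audience scoring
    print("holdout AUC:", round(roc_auc_score(y_test, scores), 3))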


Qualifications:

  1. Masters in a relevant quantitative, applied field (Statistics, Econometrics, Computer Science, Mathematics, Engineering)
  2. Minimum of 3.5 years of work experience in the end-to-end lifecycle of ML model development and deployment into production within a cloud infrastructure (Databricks is highly preferred)
  3. Proven ability to manage the output of a small team in a fast-paced environment and to lead by example in the fulfilment of client requests
  4. Exhibit deep knowledge of core mathematical principles relating to data science and machine learning (ML Theory + Best Practices, Feature Engineering and Selection, Supervised and Unsupervised ML, A/B Testing, etc.)
  5. Proficiency in Python and SQL required; PySpark/Spark experience a plus
  6. Ability to conduct a productive peer review and proper code structure in Github
  7. Proven experience developing, testing, and deploying various ML algorithms (neural networks, XGBoost, Bayes, and the like)
  8. Working knowledge of modern CI/CD methods.

This position description is intended to describe the duties most frequently performed by an individual in this position. It is not intended to be a complete list of assigned duties but to describe a position level.

Cravingcode Technologies Pvt Ltd
Posted by Didhiti Dasgupta
Remote only
2 - 5 yrs
₹5L - ₹9L / yr
.NET
Angular (2+)
Web API
SQL
Entity Framework

Job Description: .NET + Angular Full Stack Developer

Position: Full Stack Developer (.NET + Angular)

Experience: 3 – 5 Years


About the Role

We are looking for a highly skilled .NET Angular Full Stack Developer to join our dynamic team. The ideal candidate should have strong expertise in both back-end and front-end development, hands-on experience with .NET Core and Angular, and a passion for building scalable, secure, and high-performance applications.


Key Responsibilities

  • Design, develop, and maintain scalable, high-quality web applications using .NET Core 8, ASP.NET MVC, Web API, and Angular 13+.
  • Build and integrate RESTful APIs and ensure seamless communication between front-end and back-end services.
  • Develop, optimize, and maintain SQL Server (2012+) databases, ensuring high availability, performance, and reliability.
  • Write complex stored procedures, functions, triggers, and perform query tuning and indexing for performance optimization.
  • Work with Entity Framework/EF Core to implement efficient data access strategies.
  • Collaborate with cross-functional teams to define, design, and ship new features.
  • Implement OAuth 2.0 authentication/authorization for secure access control.
  • Write clean, testable, and maintainable code following Test-Driven Development (TDD) principles.
  • Use GIT / TFVC for version control and collaborate using Azure DevOps Services for CI/CD pipelines.
  • Participate in code reviews, troubleshoot issues, and optimize application performance.
  • Stay updated with emerging technologies and recommend improvements to enhance system architecture.


Required Technical Skills

  • 3+ years of experience in .NET development (C#, .NET Core 8, ASP.NET MVC, Web API).
  • Strong experience in SQL Server development including:
  • Query tuning, execution plan analysis, and performance optimization.
  • Designing and maintaining indexes, partitioning strategies, and database normalization.
  • Handling large datasets and optimizing stored procedures for scalability.
  • Experience with SQL Profiler, Extended Events, and monitoring tools.
  • Proficiency in Entity Framework / EF Core for ORM-based development.
  • Familiarity with PostgreSQL and cross-database integration is a plus.
  • Expertise in Angular 13+, HTML5, CSS, TypeScript, JavaScript, and Bootstrap.
  • Experience with REST APIs development and integration.
  • Knowledge of OAuth 2.0 and secure authentication methods.
  • Hands-on experience with GIT/TFVC and Azure DevOps for source control and CI/CD pipelines.
  • Basic knowledge of Node.js framework is a plus.
  • Experience with unit testing frameworks like NUnit, MSTest, etc.


Soft Skills

  • Strong problem-solving and analytical skills, particularly in debugging performance bottlenecks.
  • Excellent communication and collaboration abilities.
  • Ability to work independently and in a team environment.
  • Attention to detail and a passion for writing clean, scalable, and optimized code.


AT
Remote only
5 - 10 yrs
₹5L - ₹10L / yr
React.js
NextJs (Next.js)
NodeJS (Node.js)
Redux/Flux
TypeScript
+4 more

Full Stack Engineer (Frontend Strong, Backend Proficient)

5-10 Years Experience

Contract: 6 months, extendable

Location: Remote

Technical Requirements

Frontend Expertise (Strong)

*Need at least 4 yrs in React web development, Node & AI.*

● Deep proficiency in React, Next.js, TypeScript

● Experience with state management (Redux, Context API)

● Frontend testing expertise (Jest, Cypress)

● Proven track record of achieving high Lighthouse performance scores

Backend Proficiency

● Solid experience with Node.js, NestJS (preferred), or ExpressJS

● Database management (SQL, NoSQL)

● Cloud technologies experience (AWS, Azure)

● Understanding of OpenAI and AI integration capabilities (bonus)

Full Stack Integration

● Excellent ability to manage and troubleshoot integration issues between frontend and backend systems

● Experience designing cohesive systems with proper separation of concerns

Syrencloud
Posted by Sudheer Kumar
Remote, Hyderabad
3 - 10 yrs
₹10L - ₹30L / yr
Microsoft Fabric
ADF
Synapse
Databricks
Microsoft Windows Azure
+5 more

We are seeking a highly skilled Fabric Data Engineer with strong expertise in Azure ecosystem to design, build, and maintain scalable data solutions. The ideal candidate will have hands-on experience with Microsoft Fabric, Databricks, Azure Data Factory, PySpark, SQL, and other Azure services to support advanced analytics and data-driven decision-making.


Key Responsibilities

  • Design, develop, and maintain scalable data pipelines using Microsoft Fabric and Azure data services.
  • Implement data integration, transformation, and orchestration workflows with Azure Data Factory, Databricks, and PySpark.
  • Work with stakeholders to understand business requirements and translate them into robust data solutions.
  • Optimize performance and ensure data quality, reliability, and security across all layers.
  • Develop and maintain data models, metadata, and documentation to support analytics and reporting.
  • Collaborate with data scientists, analysts, and business teams to deliver insights-driven solutions.
  • Stay updated with emerging Azure and Fabric technologies to recommend best practices and innovations.

Required Skills & Experience

  • Proven experience as a Data Engineer with strong expertise in the Azure cloud ecosystem.

Hands-on experience with:

  • Microsoft Fabric
  • Azure Databricks
  • Azure Data Factory (ADF)
  • PySpark & Python
  • SQL (T-SQL/PL-SQL)
  • Solid understanding of data warehousing, ETL/ELT processes, and big data architectures.
  • Knowledge of data governance, security, and compliance within Azure.
  • Strong problem-solving, debugging, and performance tuning skills.
  • Excellent communication and collaboration abilities.

 

Preferred Qualifications

  • Microsoft Certified: Fabric Analytics Engineer Associate / Azure Data Engineer Associate.
  • Experience with Power BI, Delta Lake, and Lakehouse architecture.
  • Exposure to DevOps, CI/CD pipelines, and Git-based version control.
SupplyHouse
Posted by Susannah York
Remote only
3 - 6 yrs
₹22L - ₹28L / yr
Java
Spring Boot
SQL

Real people. Real service.


At SupplyHouse.com, we value every individual team member and cultivate a community where people come first. Led by our core values of Generosity, Respect, Innovation, Teamwork, and GRIT, we’re dedicated to maintaining a supportive work environment that celebrates diversity and empowers everyone to reach their full potential. As an industry-leading e-commerce company specializing in HVAC, plumbing, heating, and electrical supplies since 2004, we strive to foster growth while providing the best possible experience for our customers.


Through an Employer of Record (EOR), we are looking for a new, remote Backend Engineer in India to join our growing IT Team. This individual will report into our Full Stack Team Lead and have the opportunity to work on impactful projects that enhance our e-commerce platform and internal operations, while honing their skills in backend and full stack development. If you’re passionate about creating user-friendly interfaces, building scalable systems, and contributing to innovative solutions in a collaborative and fun environment, we’d love to hear from you!


Role Type: Full-Time

Location: Remote from India

Schedule: Monday through Friday, 4:00 a.m. – 1:00 p.m. U.S. Eastern Time / 12:00 p.m. – 9:00 p.m. Indian Standard Time to ensure effective collaboration 

Base Salary: $25,000 - $30,000 USD per year


Responsibilities:

  • Collaborate with cross-functional teams to gather and refine requirements, ensuring alignment with business needs.
  • Design, develop, test, deploy, and maintain scalable, high-performance software applications.
  • Develop and enhance internal tools and applications to improve company operations.
  • Ensure system reliability, optimize application performance, and implement best practices for scalability.
  • Continuously improve existing codebases, conducting code reviews, and implementing modern practices.
  • Stay up to date with emerging technologies, trends, and best practices in software development.


Requirements:

  • Bachelor’s or Master’s degree in Computer Science, Software Engineering, or a related field.
  • 3+ years of hands-on experience in backend and/or full-stack development with a proven track record of delivering high-quality software.

Back-End Skills:

  • Proficiency in Java and experience with back-end frameworks like Spring Boot.
  • Strong understanding of database design, RDBMS concepts, and experience with SQL.
  • Knowledge of RESTful API design and integration.

Development Lifecycle: Proven ability to contribute across the entire software development lifecycle, including planning, design, coding, testing, deployment, and maintenance.

Tools & Practices:

  • Familiarity with version control systems, like Git, and CI/CD pipelines.
  • Experience with agile development methodologies.

Additional Skills:

  • Strong problem-solving and debugging capabilities.
  • Ability to create reusable code libraries and write clean, maintainable code.
  • Strong communication and collaboration skills to work effectively within a team and across departments.
  • High-level proficiency of written and verbal communication in English.


Preferred Qualifications:

  • Proficiency in HTML5, CSS3, JavaScript (ES6+), and responsive design principles.
  • Expertise in modern JavaScript frameworks and libraries such as React, Angular, or Vue.js.
  • Experience with cross-browser compatibility and performance optimization techniques. 
  • Experience working on Frontend responsibilities such as: 
  • Designing and implementing reusable, maintainable UI components and templates.
  • Working closely with Designers to ensure technical feasibility and adherence to UI/UX design standards.
  • Managing and updating promotional banners and site-wide templates to ensure timely execution of marketing initiatives.


Why work with us: 

  • We have awesome benefits – We offer a wide variety of benefits to help support you and your loved ones. These include: Comprehensive and affordable medical, dental, vision, and life insurance options; Competitive Provident Fund contributions; Paid casual and sick leave, plus country-specific holidays; Mental health support and wellbeing program; Company-provided equipment and one-time $250 USD work from home stipend; $750 USD annual professional development budget; Company rewards and recognition program; And more!
  • We promote work-life balance – We value your time and encourage a healthy separation between your professional and personal life to feel refreshed and recharged. Look out for our 100% remote schedule and wellness initiatives! 
  • We support growth– We strive to innovate every day. In an exciting and evolving industry, we provide potential for career growth through our hands-on training, access to the latest technologies and tools, diversity and inclusion initiatives, opportunities for internal mobility, and professional development budget. 
  • We give back –We live and breathe our core value, Generosity, by giving back to the trades and organizations around the world. We make a difference through donation drives, employee-nominated contributions, support for DE&I organizations, and more.  
  • We listen – We value hearing from our employees. Everyone has a voice, and we encourage you to use it! We actively elicit feedback through our monthly town halls, regular 1:1 check-ins, and company-wide ideas form to incorporate suggestions and ensure our team enjoys coming to work every day. 


Check us out and learn more at https://www.supplyhouse.com/our-company


Additional Details: 

  • Remote employees are expected to work in a distraction-free environment. Personal devices, background noise, and other distractions should be kept to a minimum to avoid disrupting virtual meetings or business operations.   
  • SupplyHouse.com is an Equal Opportunity Employer, strongly values inclusion, and encourages individuals of all backgrounds and experiences to apply for this position. 
  • To ensure fairness, all application materials, assessments, and interview responses must reflect your own original work. The use of AI tools, plagiarism, or any uncredited assistance is not permitted at any stage of the hiring process and may result in disqualification. We appreciate your honesty and look forward to seeing your skills. 
  • We are committed to providing a safe and secure work environment and conduct thorough background checks on all potential employees in accordance with applicable laws and regulations.
  • All emails from the SupplyHouse team will only be sent from an @supplyhouse.com email address. Please exercise caution if you receive an email from an alternate domain.


What is an Employer of Record (EOR)?

Through our partnership with Remote.com, a global Employer of Record (EOR), you can join SupplyHouse from home, while knowing your employment is handled compliantly and securely. Remote takes care of the behind-the-scenes details – like payroll, benefits, taxes, and local compliance – so you can focus on your work and career growth. Even though Remote manages these administrative functions, you’ll be a part of the SupplyHouse team: connected to our culture, collaborating with colleagues, and contributing to our shared success. This partnership allows us to welcome talented team members worldwide while ensuring you receive a best-in-class employee experience.

Uphance LLC
Posted by Abhishek Shah
Remote only
6 - 11 yrs
₹30L - ₹40L / yr
Ruby
Ruby on Rails (ROR)
Enterprise Resource Planning (ERP)
Enterprise architecture
JavaScript
+2 more

We seek a highly skilled and experienced Ruby on Rails Development Team Lead/Architect to join our dynamic team at Uphance. The ideal candidate will have proven expertise in leading and architecting RoR projects, focusing on building scalable, high-quality applications. This role requires a combination of technical leadership, mentorship, and a strong commitment to best practices in software development.


Job Type: Contract/Remote/Full-Time/Long-term


Responsibilities:

  • Develop and maintain Ruby on Rails applications that meet our high quality standards.
  • Design, build, and maintain efficient, reusable, and reliable Ruby code.
  • Utilise your expertise in Ruby on Rails to enhance the performance and reliability of our platform.
  • Set the technical direction for the existing RoR project, including system architecture and technology stack decisions.
  • Guide and mentor team members to enhance their technical skills and understanding of RoR best practices.
  • Conduct code reviews to maintain high coding standards and ensure adherence to best practices.
  • Optimise application performance, focusing on ActiveRecord queries and overall architecture.
  • Tackle complex technical challenges and provide efficient solutions, particularly when specifications are unclear or incomplete.
  • Establish and enforce testing protocols; write and guide the team in writing effective tests.
  • Define and ensure consistent adherence to best practices, particularly in the context of large applications.
  • Manage the development process using Agile methodologies, possibly acting as a Scrum Master if required.
  • Work closely with product managers, designers, and other stakeholders to meet project requirements and timelines.


Technical Requirements and Qualifications:

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field
  • Proven experience with Ruby on Rails, MySQL, HTML, and JavaScript (6+ years)
  • Extensive experience with Ruby on Rails and familiarity with its best practices
  • Proven track record of technical leadership and team management
  • Strong problem-solving skills and the ability to address issues with incomplete specifications
  • Proficiency in performance optimisation and software testing
  • Experience with Agile development and Scrum practices
  • Excellent mentoring and communication skills
  • Experience with large-scale application development
  • Application performance monitoring/tuning


General Requirements:

  • Availability to work during the IST working hours.
  • High-speed Internet and the ability to join technical video meetings during business hours.
  • Strong analytical and problem-solving skills and ability to work as part of multi-functional teams.
  • Ability to collaborate and be a team player.


Why Uphance?

  • Engage in Innovative Projects: Immerse yourself in cutting-edge projects that not only test your skills but also encourage the exploration of new design realms.
  • AI-Integrated Challenges: Take on projects infused with AI, pushing the boundaries of your abilities and allowing for exploration in uncharted territories of software design and development.
  • Flexible Work Environment: Whether you embrace the digital nomad lifestyle or prefer the comfort of your own space, Uphance provides the freedom to design and create from any corner of the globe.
  • Inclusive Team Environment: Join a dynamic, international, and inclusive team that values and celebrates diverse ideas.
  • Collaborative Team Dynamics: Become a part of a supportive and motivated team that shares in the celebration of collective successes.
  • Recognition and Appreciation: Your accomplishments will be acknowledged and applauded regularly in our Recognition Rally.


Compensation:

Salary Range: INR 24 LPA to INR 32 LPA (Salary is not a constraint for the right candidate)


At Uphance, we value innovation, collaboration, and continuous learning. As part of our team, you'll have the opportunity to lead a group of talented RoR developers, contribute to exciting projects, and play a key role in our company's success. If you are passionate about Ruby on Rails and thrive in a leadership role, we would love to hear from you. Apply today and follow us on LinkedIn - https://www.linkedin.com/company/uphance !

Read more
Cymetrix Software

at Cymetrix Software

2 candid answers
Netra Shettigar
Posted by Netra Shettigar
Remote only
3 - 7 yrs
₹8L - ₹20L / yr
Google Cloud Platform (GCP)
ETL
Python
Big Data
SQL
+4 more

Must have skills:

1. GCP – GCS, Pub/Sub, Dataflow or Dataproc, BigQuery, Airflow/Composer, Python (preferred)/Java

2. ETL on GCP Cloud – build pipelines (Python/Java) plus scripting, best practices, and challenges

3. Knowledge of batch and streaming data ingestion; ability to build end-to-end data pipelines on GCP

4. Knowledge of databases (SQL, NoSQL), on-premise and on-cloud, SQL vs NoSQL, and types of NoSQL databases (at least 2)

5. Data warehouse concepts – beginner to intermediate level


Role & Responsibilities:

● Work with business users and other stakeholders to understand business processes.

● Ability to design and implement Dimensional and Fact tables

● Identify and implement data transformation/cleansing requirements

● Develop a highly scalable, reliable, and high-performance data processing pipeline to extract, transform and load data from various systems to the Enterprise Data Warehouse

● Develop conceptual, logical, and physical data models with associated metadata including data lineage and technical data definitions

● Design, develop and maintain ETL workflows and mappings using the appropriate data load technique

● Provide research, high-level design, and estimates for data transformation and data integration from source applications to end-user BI solutions.

● Provide production support of ETL processes to ensure timely completion and availability of data in the data warehouse for reporting use.

● Analyze and resolve problems and provide technical assistance as necessary. Partner with the BI team to evaluate, design, develop BI reports and dashboards according to functional specifications while maintaining data integrity and data quality.

● Work collaboratively with key stakeholders to translate business information needs into well-defined data requirements to implement the BI solutions.

● Leverage transactional information, data from ERP, CRM, HRIS applications to model, extract and transform into reporting & analytics.

● Define and document the use of BI through user experience/use cases, prototypes, test, and deploy BI solutions.

● Develop and support data governance processes, analyze data to identify and articulate trends, patterns, outliers, quality issues, and continuously validate reports, dashboards and suggest improvements.

● Train business end-users, IT analysts, and developers.

Read more
Remote only
3 - 5 yrs
₹14L - ₹20L / yr
React.js
Redux/Flux
MobX
ES6
RESTful APIs
+16 more

About Us:

MyOperator is India’s leading Cloud Telephony platform, empowering 40,000+ businesses with smarter communication solutions. We are scaling our engineering team to build high-performing, reliable, and scalable web applications. We’re looking for a React.js Developer with strong expertise in frontend engineering who can take ownership of building pixel-perfect, user-friendly, and performant web applications. Exposure to backend (Node.js) is a plus.


Key Responsibilities

Frontend (React.js – Primary Focus):

  • Build modern, responsive, and high-performance UIs using React.js.
  • Implement state management using Redux, MobX, or similar libraries.
  • Create and optimize React Hooks (inbuilt & custom).
  • Write unit tests to ensure product quality and maintainability.
  • Apply ES6+ features, Webpack, and other modern JS tooling.
  • Diagnose and fix UI/UX performance bottlenecks.
  • Debug and resolve cross-browser compatibility issues.

Backend (Node.js – Secondary):

  • Basic ability to build and integrate RESTful APIs with Node.js.
  • Familiarity with frameworks like Express.js or NestJS.
  • Understanding of authentication, session handling, and caching.

Databases & Tools:

  • Work with SQL databases (mandatory).
  • Exposure to NoSQL databases and ORMs is a plus.
  • Use Git for version control and collaborative coding.

Qualifications

  • 3+ years of professional software development experience.
  • 3+ years of proven experience with React.js.
  • Solid understanding of JavaScript (ES6+), HTML5, CSS3.
  • Strong knowledge of state management, hooks, and UI performance optimization.
  • Good problem-solving skills with a focus on clean, maintainable code.
  • Exposure to Node.js and backend concepts (good to have).

Good to Have

  • Experience with TypeScript.
  • Knowledge of Next.js for server-side rendering.
  • Familiarity with REST APIs and basic backend integration.
  • Strong debugging and browser performance optimization skills.

Why Join Us?

  • Opportunity to specialize in React.js while working on impactful products.
  • Collaborative environment with full ownership of features.
  • Work with cutting-edge frontend technologies at scale.
  • Competitive compensation and career growth opportunities.


Read more
venanalytics

at venanalytics

2 candid answers
Rincy jain
Posted by Rincy jain
Remote only
2 - 4 yrs
₹7L - ₹10L / yr
Python
SQL
PowerBI
DAX

About Ven Analytics


At Ven Analytics, we don’t just crunch numbers — we decode them to uncover insights that drive real business impact. We’re a data-driven analytics company that partners with high-growth startups and enterprises to build powerful data products, business intelligence systems, and scalable reporting solutions. With a focus on innovation, collaboration, and continuous learning, we empower our teams to solve real-world business problems using the power of data.

Role Overview

We’re looking for a Power BI Data Engineer who is not just proficient in tools but passionate about building insightful, scalable, and high-performing dashboards. The ideal candidate should have strong fundamentals in data modeling, a flair for storytelling through data, and the technical skills to implement robust data solutions using Power BI, Python, and SQL.


Key Responsibilities

  • Technical Expertise: Develop scalable, accurate, and maintainable data models using Power BI, with a clear understanding of Data Modeling, DAX, Power Query, and visualization principles.
  • Programming Proficiency: Use SQL and Python for complex data manipulation, automation, and analysis.
  • Business Problem Translation: Collaborate with stakeholders to convert business problems into structured data-centric solutions considering performance, scalability, and commercial goals.
  • Hypothesis Development: Break down complex use-cases into testable hypotheses and define relevant datasets required for evaluation.
  • Solution Design: Create wireframes, proof-of-concepts (POC), and final dashboards in line with business requirements.
  • Dashboard Quality: Ensure dashboards meet high standards of data accuracy, visual clarity, performance, and support SLAs.
  • Performance Optimization: Continuously enhance user experience by improving performance, maintainability, and scalability of Power BI solutions.
  • Troubleshooting & Support: Quick resolution of access, latency, and data issues as per defined SLAs.


Must-Have Skills

  • Strong experience building robust data models in Power BI
  • Hands-on expertise with DAX (complex measures and calculated columns)
  • Proficiency in M Language (Power Query) beyond drag-and-drop UI
  • Clear understanding of data visualization best practices (less fluff, more insight)
  • Solid grasp of SQL and Python for data processing
  • Strong analytical thinking and ability to craft compelling data stories


Good-to-Have (Bonus Points)

  • Experience using DAX Studio and Tabular Editor
  • Prior work in a high-volume data processing production environment
  • Exposure to modern CI/CD practices or version control with BI tools

 

Why Join Ven Analytics?

  • Be part of a fast-growing startup that puts data at the heart of every decision.
  • Opportunity to work on high-impact, real-world business challenges.
  • Collaborative, transparent, and learning-oriented work environment.
  • Flexible work culture and focus on career development.
Read more
OIP Insurtech

at OIP Insurtech

2 candid answers
Katarina Vasic
Posted by Katarina Vasic
Remote, Hyderabad
3 - 10 yrs
₹7L - ₹14L / yr
Property and casualty insurance
SQL
JSON
API
XML

We are seeking a Technical Specialist (TS) to join our team. The TS is a hands-on expert in both insurance processes and technical solutions, bridging the gap between business requirements and system performance. In this role, you will be directly responsible for configuring, optimizing, and enhancing underwriting platforms, ensuring that forms, raters, and integrations work seamlessly. You will play a key role in building and maintaining SQL-driven solutions that support clients in the insurtech space.


Key Responsibilities



1. Data Mapping for Forms & Raters

  • Map backend database fields to front-end user interfaces for accurate data presentation.
  • Configure and maintain digital insurance forms (applications, dec pages, endorsements, claims, schedules, coverage summaries).
  • Build and manage rater integrations with underwriting systems using SQL, APIs, XML, and spreadsheet mapping.
  • Validate rating outputs to ensure premiums and calculations follow business logic.

2. SQL Development, Reports & Validations

  • Write, optimize, and maintain SQL queries, stored procedures, and validations.
  • Develop automated reports and outputs (BDX, renewals, loss runs, compliance audits).
  • Build and maintain Quote Covers, ensuring correct rating logic and business rule application.
  • Use SQL for deeper system enhancements, including troubleshooting data flow and optimizing performance.

3. Configuring & Enhancing Underwriting Platforms (C1/Other)

  • Configure system components such as UDTs, templates, invoices/checks, and workflows.
  • Troubleshoot technical issues and optimize performance (e.g., slow reports, broken data flows).
  • Collaborate with underwriting and compliance teams to align system setup with operational needs.

4. API, JSON, and XML Integration

  • Integrate underwriting systems with external APIs (e.g., driving records, carrier services, risk tools).
  • Transform and structure data in JSON/XML for seamless communication between systems (see the sketch after this list).
  • Manage third-party integrations, ensuring accuracy and reliability of connected workflows.
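
To make the JSON/XML integration work above more concrete, here is a minimal Python sketch of turning a JSON payload into XML. The field names and the Policy element structure are illustrative assumptions only, not any client's actual schema.

```python
import json
import xml.etree.ElementTree as ET

# Hypothetical payload from an external rating API (field names are illustrative only).
raw = '{"policy_number": "POL-1001", "insured_name": "Acme Corp", "premium": 1250.50}'
record = json.loads(raw)

# Build the XML shape a downstream underwriting system might expect.
root = ET.Element("Policy")
for field, value in record.items():
    child = ET.SubElement(root, field)
    child.text = str(value)

xml_payload = ET.tostring(root, encoding="unicode")
print(xml_payload)  # <Policy><policy_number>POL-1001</policy_number>...
```

In practice the same mapping would be driven by a configuration or schema definition rather than hard-coded field names, but the shape of the transformation is the same.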

Tools & Technologies

  • SQL & SSMS – advanced development of queries, procedures, reports, and validations.
  • Adobe Acrobat Pro – form design and field mapping.
  • Infomaker (PowerBuilder) – reporting, form development, and data mapping.
  • Mapping Utilities & Excel – for rater and spreadsheet integrations.
  • APIs (REST/SOAP) – JSON and XML for integrations and data exchange.

Collaboration & Workflow

  • Partner with Business Analysts to translate requirements into technical specifications.
  • Work with Quality Assurance to test, validate, and refine outputs.
  • Collaborate with Developers for custom features, but own the majority of SQL- and configuration-driven development.

Qualifications

  • Strong hands-on proficiency in SQL (development-level queries, procedures, validations, and reporting).
  • Proven experience with underwriting systems (ConceptOne or similar).
  • Familiarity with insurance workflows: policy issuance, endorsements, raters, claims, and reporting.
  • Experience with APIs, JSON, XML, and integration practices.
  • Strong problem-solving skills with a focus on system optimization and troubleshooting.
  • Clear communicator, capable of bridging business needs with technical solutions.

Why Join Us?

As a Technical Specialist, you will be more than a system configurator—you will be a developer of clarity and efficiency within underwriting technology. Your SQL expertise and insurance knowledge will directly shape how underwriters, MGAs, and carriers interact with their systems. By optimizing workflows, building reports, and enabling seamless integrations, you’ll help our clients reduce errors, save time, and fully unlock the potential of their platforms.

Read more
Remote only
0 - 1 yrs
₹5000 - ₹7000 / mo
Attention to detail
Troubleshooting
Data modeling
warehousing concepts
Google Cloud Platform (GCP)
+15 more

Springer Capital is a cross-border asset management firm specializing in real estate investment banking between China and the USA. We are offering a remote internship for aspiring data engineers interested in data pipeline development, data integration, and business intelligence. The internship offers flexible start and end dates. A short quiz or technical task may be required as part of the selection process.


Responsibilities:

  • Design, build, and maintain scalable data pipelines for structured and unstructured data sources
  • Develop ETL processes to collect, clean, and transform data from internal and external systems (see the sketch after this list)
  • Support integration of data into dashboards, analytics tools, and reporting systems
  • Collaborate with data analysts and software developers to improve data accessibility and performance
  • Document workflows and maintain data infrastructure best practices
  • Assist in identifying opportunities to automate repetitive data tasks
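
As a rough illustration of the extract-transform-load flow described above, here is a minimal pandas sketch. The file names, column names, and derived metric are assumptions for illustration, not the firm's actual pipeline.

```python
import pandas as pd

def run_pipeline(source_csv: str, output_csv: str) -> pd.DataFrame:
    """Toy ETL step: extract a CSV, clean and transform it, then load the result."""
    # Extract: read the raw export (file and column names are assumptions).
    df = pd.read_csv(source_csv)

    # Transform: normalize column names, drop rows missing key fields,
    # and derive a simple metric a dashboard might use.
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    df = df.dropna(subset=["property_id", "rent"])
    df["rent_per_sqft"] = df["rent"] / df["square_feet"]

    # Load: write the cleaned data where a reporting tool could pick it up.
    df.to_csv(output_csv, index=False)
    return df

if __name__ == "__main__":
    run_pipeline("raw_listings.csv", "clean_listings.csv")
```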


Please send your resume to talent@springer.capital

Read more
RockED

at RockED

2 candid answers
Kashish Trehan
Posted by Kashish Trehan
Remote only
4 - 9 yrs
₹15L - ₹40L / yr
NodeJS (Node.js)
MySQL
JavaScript
SQL
Express
+3 more

Your Impact

  • Build scalable backend services.
  • Design, implement, and maintain databases, ensuring data integrity, security, and efficient retrieval.
  • Implement the core logic that makes applications work, handling data processing, user requests, and system operations
  • Contribute to the architecture and design of new product features
  • Optimize systems for performance, scalability, and security
  • Stay up-to-date with new technologies and frameworks, contributing to the advancement of software development practices
  • Work closely with product managers and designers to turn ideas into reality and shape the product roadmap.

What skills do you need?


  • 4+ years of experience in backend development, especially building robust APIs using Node.js, Express.js, and MySQL
  • Strong command of JavaScript and understanding of its quirks and best practices
  • Ability to think strategically when designing systems—not just how to build, but why
  • Exposure to system design and interest in building scalable, high-availability systems
  • Prior work on B2C applications with a focus on performance and user experience
  • Ensure that applications can handle increasing loads and maintain performance, even under heavy traffic
  • Work with complex queries for performing sophisticated data manipulation, analysis, and reporting.
  • Knowledge of Sequelize, MongoDB and AWS would be an advantage.
  • Experience in optimizing backend systems for speed and scalability.


Read more
suntekai
Khushi Ash
Posted by Khushi Ash
Remote only
0.6 - 1.6 yrs
₹1.2L - ₹2.6L / yr
Data Visualization
SQL
Python
Business Intelligence (BI)
PostgreSQL

Job Description: Data Analyst


About the Role

We are seeking a highly skilled Data Analyst with strong expertise in SQL/PostgreSQL, Python (Pandas), Data Visualization, and Business Intelligence tools to join our team. The candidate will be responsible for analyzing large-scale datasets, identifying trends, generating actionable insights, and supporting business decisions across marketing, sales, operations, and customer experience.

Key Responsibilities

Data Extraction & Management:
  • Write complex SQL queries in PostgreSQL to extract, clean, and transform large datasets.
  • Ensure accuracy, reliability, and consistency of data across different platforms.

Data Analysis & Insights:
  • Conduct deep-dive analyses to understand customer behavior, funnel drop-offs, product performance, campaign effectiveness, and sales trends.
  • Perform cohort, LTV (lifetime value), retention, and churn analysis to identify opportunities for growth.
  • Provide recommendations to improve conversion rates, average order value (AOV), and repeat purchase rates.

Business Intelligence & Visualization:
  • Build and maintain interactive dashboards and reports using BI tools (e.g., Power BI, Metabase, or Looker).
  • Create visualizations that simplify complex datasets for stakeholders and management.

Python (Pandas):
  • Use Python (Pandas, NumPy) for advanced analytics.

Collaboration & Stakeholder Management:
  • Work closely with product, operations, and leadership teams to provide insights that drive decision-making.
  • Communicate findings in a clear, concise, and actionable manner to both technical and non-technical stakeholders.

Required Skills

SQL/PostgreSQL:
  • Complex joins, window functions, CTEs, aggregations, and query optimization (see the sketch after this list).

Python (Pandas & Analytics):
  • Data wrangling, cleaning, transformations, and exploratory data analysis (EDA).
  • Libraries: Pandas, NumPy, Matplotlib, Seaborn.

Data Visualization & BI Tools:
  • Expertise in creating dashboards and reports using Metabase or Looker.
  • Ability to translate raw data into meaningful visual insights.

Business Intelligence:
  • Strong analytical reasoning to connect data insights with e-commerce KPIs.
  • Experience in funnel analysis, customer journey mapping, and retention analysis.

Analytics & E-commerce Knowledge:
  • Understanding of metrics like CAC, ROAS, LTV, churn, and contribution margin.

General Skills:
  • Strong communication and presentation skills.
  • Ability to work cross-functionally in fast-paced environments.
  • Problem-solving mindset with attention to detail.
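
For a sense of the SQL and Pandas work listed above, here is a minimal sketch combining a CTE, window functions, and a small pandas step. The in-memory SQLite table and its columns are illustrative stand-ins for a real PostgreSQL source.

```python
import sqlite3
import pandas as pd

# In-memory SQLite stands in for PostgreSQL here; window functions need SQLite 3.25+.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer_id INT, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, '2024-01-05', 120.0), (1, '2024-02-10', 80.0),
        (2, '2024-01-20', 200.0), (2, '2024-03-02', 50.0);
""")

# A CTE plus window functions: rank each customer's orders by date and
# accumulate a running lifetime value (LTV).
query = """
WITH ranked AS (
    SELECT customer_id,
           order_date,
           amount,
           ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_date) AS order_rank,
           SUM(amount)  OVER (PARTITION BY customer_id ORDER BY order_date) AS running_ltv
    FROM orders
)
SELECT * FROM ranked;
"""
df = pd.read_sql_query(query, conn)

# Quick pandas follow-up: share of customers with more than one order.
repeat_rate = (df.groupby("customer_id")["order_rank"].max() > 1).mean()
print(df)
print(f"Repeat purchase rate: {repeat_rate:.0%}")
```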

Education: Bachelor’s degree in Data Science, Computer Science, or a related field such as data processing


Read more
Remote only
0 - 10 yrs
₹3000 - ₹3000 / mo
Java
Python
SQL

Backend Engineering Intern (Infrastructure Software) – Remote

Position Type: Internship (Full-Time or Part-Time)

Location: Remote

Duration: 12 weeks

Compensation: Unpaid (***3000 INR is just a placeholder***)

About the Role

We are seeking a motivated Backend Developer Intern to join our engineering team and contribute to building scalable, efficient, and secure backend services. This internship offers hands-on experience in API development, database management, and backend architecture, with guidance from experienced developers. You will work closely with cross-functional teams to deliver features that power our applications and improve user experience.

Responsibilities

  • Assist in designing, developing, and maintaining backend services, APIs, and integrations (see the sketch after this list).
  • Collaborate with frontend engineers to support application functionality and data flow.
  • Write clean, efficient, and well-documented code.
  • Support database design, optimization, and query performance improvements.
  • Participate in code reviews, debugging, and troubleshooting production issues.
  • Assist with unit testing, integration testing, and ensuring system reliability.
  • Work with cloud-based environments (e.g., AWS, Azure, GCP) to deploy and manage backend systems.
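
As an illustration of the kind of REST endpoint work described above, here is a minimal sketch using Flask purely as an example (the role itself lists Django, Spring Boot, Node.js, and Rails as possible stacks). The /api/tasks resource and its fields are hypothetical.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# In-memory store standing in for a real database (illustrative only).
TASKS = {1: {"id": 1, "title": "Write unit tests", "done": False}}

@app.get("/api/tasks/<int:task_id>")  # Flask 2+ shorthand for route(..., methods=["GET"])
def get_task(task_id: int):
    """Return one task as JSON, or a 404 if it does not exist."""
    task = TASKS.get(task_id)
    if task is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(task)

@app.post("/api/tasks")
def create_task():
    """Create a task from a JSON body and return it with a 201 status."""
    payload = request.get_json(force=True)
    task_id = max(TASKS, default=0) + 1
    TASKS[task_id] = {"id": task_id, "title": payload["title"], "done": False}
    return jsonify(TASKS[task_id]), 201

if __name__ == "__main__":
    app.run(debug=True)
```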

Requirements

  • Currently pursuing or recently completed a degree in Computer Science, Software Engineering, or related field.
  • Familiarity with one or more backend languages/frameworks (e.g., Node.js, Python/Django, Java/Spring Boot, Ruby on Rails).
  • Understanding of RESTful APIs and/or GraphQL.
  • Basic knowledge of relational and/or NoSQL databases (e.g., MySQL, PostgreSQL, MongoDB).
  • Familiarity with version control (Git/GitHub).
  • Strong problem-solving skills and attention to detail.
  • Ability to work independently in a remote, collaborative environment.

Preferred Skills (Nice to Have)

  • Experience with cloud services (AWS Lambda, S3, EC2, etc.).
  • Familiarity with containerization (Docker) and CI/CD pipelines.
  • Basic understanding of authentication and authorization (OAuth, JWT).
  • Interest in backend performance optimization and scalability.

What You’ll Gain

  • Hands-on experience building backend systems for real-world applications.
  • Exposure to industry-standard tools, workflows, and coding practices.
  • Mentorship from experienced backend engineers.
  • Opportunity to contribute to live projects impacting end users.
Read more
InEvolution
Remote only
2 - 3 yrs
₹5L - ₹7L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+7 more

About the Role:


We are on the lookout for a dynamic Marketing Automation and Data Analytics Specialist, someone who is not only adept in marketing automation and operations but also possesses keen expertise in data analytics and visualization. This role is tailor-made for individuals who are proficient with tools like Eloqua, Marketo, Salesforce Pardot, and Power BI.


As our Marketing Automation and Data Analytics Specialist, your responsibilities will span across managing and optimizing marketing automation systems and overseeing the migration and enhancement of data systems and dashboards. You will play a pivotal role in blending marketing strategies with data analytics, ensuring the creation of visually appealing and effective reports and dashboards. Collaborating closely with marketing teams, you will help in making data-driven decisions that propel the company forward.


We believe in fostering an environment where initiative and self-direction are valued. While you will receive the necessary guidance and support, the autonomy of your role is a testament to our trust in your abilities and professionalism.


Responsibilities:


  • Manage and optimize marketing automation systems (Eloqua, Marketo, Salesforce Pardot) to map and improve business processes.
  • Develop, audit, and enhance data systems, ensuring accuracy and efficiency in marketing efforts.
  • Build and migrate interactive, visually appealing dashboards and reports.
  • Develop and maintain reporting and analytics for marketing efforts, database health, lead scoring, and dashboard performance.
  • Handle technical aspects of key marketing systems and integrate them with data visualization tools like Power BI.
  • Review and improve existing SQL data sources for effective integration and analytics.
  • Collaborate closely with sales, marketing, and analytics teams to define requirements, establish best practices, and ensure successful outcomes.
  • Ensure all marketing data, dashboards, and reports are accurate and effectively meet business needs.


Ideal Candidate Qualities:


  • Strong commitment to the role with a focus on long-term growth.
  • Exceptional communication and collaboration skills across diverse teams.
  • High degree of autonomy and ability to work effectively without micromanagement.
  • Strong attention to detail and organization skills.


Qualifications:


  • Hands-on experience with marketing automation systems and data analytics tools such as Eloqua, Marketo, Salesforce Pardot, and Power BI.
  • Proven experience in data visualization and dashboard creation using Power BI.
  • Experience with SQL, including building and optimizing queries.
  • Knowledge of ABM and Intent Signaling technologies is a plus.
  • Outstanding analytical skills with an ability to work with complex datasets.
  • Familiarity with data collection, cleaning, and transformation processes.


Benefits:


  • Work-from-home flexibility.
  • Career advancement opportunities and professional development support.
  • Supportive and collaborative team environment.


Hiring Process:


The hiring process at InEvolution is thoughtfully designed to ensure alignment between your career goals and our company's objectives. The process will include:


  • Initial Phone Screening: A brief conversation to discuss your background and understand your career aspirations.
  • Team Introduction Interview: Candidates who excel in the first round will engage in a deeper discussion with our team, providing insights into our work culture and the specificities of the role.
  • Technical Assessment: In the final round, you will meet our Technical Director for an in-depth conversation about your technical skills and how these align with the demands of the role.


Read more
InEvolution

at InEvolution

2 candid answers
Pavan P K
Posted by Pavan P K
Remote only
0 - 1 yrs
₹10000 - ₹15000 / mo
Email Marketing
Mailchimp
HTML/CSS
Reporting
SQL

 About the Role


We are looking for a motivated and detail-oriented Email Marketing Intern to join our marketing team. This is a hands-on opportunity to work with tools like MailChimp and Iterable, and gain real-world experience in executing and analyzing email campaigns. Ideal for someone with a background in marketing and prior internship experience in a similar field.


Key Responsibilities:


  • Support the execution of email marketing campaigns using MailChimp and Iterable.
  • Assist in segmenting audiences and setting up batch, nurture, and trigger-based campaigns.
  • Collaborate with the analytics team to help track campaign performance and contribute to reporting.
  • Work with cross-functional teams including Demand Gen, Product Marketing, and Marketing Operations to implement email marketing best practices.


What We're Looking For:


  • Educational background in Marketing, Communications, or related fields.
  • Prior internship or project experience in email marketing, CRM, or digital campaigns is a strong plus.
  • Familiarity with email marketing tools like MailChimp or Iterable.
  • Basic knowledge of HTML and content management systems.
  • Ability to handle reporting for Email Marketing and General Ecommerce Marketing.
  • Curiosity, attention to detail, and a willingness to learn.



Nice to Have (Not Mandatory)


  • Experience creating performance or campaign reports.
  • Exposure to SQL, CSS, or HTML for email customization
Read more
Springer Capital
Remote only
0 - 1 yrs
₹5000 - ₹7000 / mo
PowerBI
Microsoft Excel
SQL
Attention to detail
Troubleshooting
+13 more

Springer Capital is a cross-border asset management firm specializing in real estate investment banking between China and the USA. We are offering a remote internship for aspiring data engineers interested in data pipeline development, data integration, and business intelligence.

The internship offers flexible start and end dates. A short quiz or technical task may be required as part of the selection process.

Responsibilities:

  • Design, build, and maintain scalable data pipelines for structured and unstructured data sources
  • Develop ETL processes to collect, clean, and transform data from internal and external systems
  • Support integration of data into dashboards, analytics tools, and reporting systems
  • Collaborate with data analysts and software developers to improve data accessibility and performance
  • Document workflows and maintain data infrastructure best practices
  • Assist in identifying opportunities to automate repetitive data tasks


Read more
Ekloud INC
Remote only
6 - 14 yrs
₹18L - ₹22L / yr
Salesforce
Test Automation (QA)
Automation
Manual testing
SOQL
+9 more

Job description:

 

6+ years of hands-on experience with both manual and automated testing, with a strong preference for experience using AccelQ on Salesforce and SAP platforms.

 

Proven expertise in Salesforce particularly within the Sales Cloud module.

 

Proficient in writing complex SOQL and SQL queries for data validation and backend testing.

 

Extensive experience in designing and developing robust, reusable automated test scripts for Salesforce environments.

 

Highly skilled at early issue detection, with a deep understanding of backend configurations, process flows and validation rules.

 

Should have a strong background in Salesforce testing, with hands-on experience in automation tools such as Selenium, Provar, or TestNG. 


You will be responsible for creating and maintaining automated test scripts, executing test cases, identifying bugs, and ensuring the quality and reliability of Salesforce applications. 

 

A solid understanding of Salesforce modules (Sales Cloud, Service Cloud, etc.) and APIs is essential. 

 

Experience with CI/CD tools like Jenkins and version control systems like Git is preferred. 

 

You will work closely with developers, business analysts, and stakeholders to define test strategies and improve the overall QA process. 

Read more
Springer Capital
Andrew Rose
Posted by Andrew Rose
Remote only
0 - 1 yrs
₹5000 - ₹7000 / mo
Attention to detail
Troubleshooting
Data modeling
Warehousing concepts
Google Cloud Platform (GCP)
+15 more

Springer Capital is a cross-border asset management firm specializing in real estate investment banking between China and the USA. We are offering a remote internship for aspiring data engineers interested in data pipeline development, data integration, and business intelligence. The internship offers flexible start and end dates. A short quiz or technical task may be required as part of the selection process. 

 

Responsibilities: 

▪ Design, build, and maintain scalable data pipelines for structured and unstructured data sources 

▪ Develop ETL processes to collect, clean, and transform data from internal and external systems 

▪ Support integration of data into dashboards, analytics tools, and reporting systems 

▪ Collaborate with data analysts and software developers to improve data accessibility and performance 

▪ Document workflows and maintain data infrastructure best practices 

▪ Assist in identifying opportunities to automate repetitive data tasks 

Read more
venanalytics

at venanalytics

2 candid answers
Rincy jain
Posted by Rincy jain
Remote only
2 - 4 yrs
₹5L - ₹12L / yr
Django
Python
SQL
React.js
HTML/CSS

Role Objective

 

Develop business relevant, high quality, scalable web applications. You will be part of a dynamic AdTech team solving big problems in the Media and Entertainment Sector.

 

Roles & Responsibilities

 

* Application Design: Understand requirements from the user, create stories and be a part of the design team. Check designs, give regular feedback and ensure that the designs are as per user expectations.  

 

* Architecture: Create scalable and robust system architecture. The design should be in line with the client's infrastructure, which could be on-premises or cloud (Azure, AWS, or GCP). 

 

* Development: You will be responsible for the development of the front-end and back-end. The application stack will comprise (depending on the project) SQL, Django, Angular/React, HTML, and CSS. Knowledge of GoLang and Big Data is a plus point. 

 

* Deployment: Suggest and implement a deployment strategy that is scalable and cost-effective. Create a detailed resource architecture and get it approved. CI/CD deployment on IIS or Linux. Knowledge of Docker is a plus point. 

 

* Maintenance: Maintaining development and production environments will be a key part of your job profile. This will also include troubleshooting, fixing bugs, and suggesting ways to improve the application. 

 

* Data Migration: In the case of database migration, you will be expected to suggest appropriate strategies and implementation plans. 

 

* Documentation: Create a detailed document covering important aspects like HLD, Technical Diagram, Script Design, SOP etc. 

 

* Client Interaction: You will be interacting with the client on a day-to-day basis and hence having good communication skills is a must. 

 

Requirements

 

Education-B. Tech (Comp. Sc, IT) or equivalent 

 

Experience- 3+ years of experience developing applications on Django, Angular/React, HTML and CSS 

 

Behavioural Skills-

  1. Clear and Assertive communication 

  2. Ability to comprehend the business requirement  

  3. Teamwork and collaboration 

  4. Analytical thinking 

  5. Time Management 

  6. Strong troubleshooting and problem-solving skills 

 

Technical Skills-

  1. Back-end and Front-end Technologies: Django, Angular/React, HTML and CSS. 

  2. Cloud Technologies: AWS, GCP, and Azure 

  3. Big Data Technologies: Hadoop and Spark 

  4. Containerized Deployment: Docker and Kubernetes are a plus.

  5. Other: Understanding of Golang is a plus.

 

Read more
IT services and consulting

IT services and consulting

Agency job
via Myhashtaggs by Ravikanth Dangeti
Remote only
4 - 8 yrs
₹10L - ₹12L / yr
Machine Learning (ML)
Deep Learning
Algorithms
Python
PowerBI
+5 more



Remote Job Opportunity

Job Title: Data Scientist

Contract Duration: 6 months+

Location: offshore India


Work Time: 3 pm to 12 am


Must have 4+ Years of relevant experience.


Job Summary:

We are seeking an AI Data Scientist with a strong foundation in machine learning, deep learning, and statistical modeling to design, develop, and deploy cutting-edge AI solutions.

The ideal candidate will have expertise in building and optimizing AI models, with a deep understanding of both statistical theory and modern AI techniques. You will work on high-impact projects, from prototyping to production, collaborating with engineers, researchers, and business stakeholders to solve complex problems using AI.


Key Responsibilities:

Research, design, and implement machine learning and deep learning models for predictive and generative AI applications.

Apply advanced statistical methods to improve model robustness and interpretability.

Optimize model performance through hyperparameter tuning, feature engineering, and ensemble techniques (see the sketch at the end of this section).

Perform large-scale data analysis to identify patterns, biases, and opportunities for AI-driven automation.

Work closely with ML engineers to validate, train, and deploy the models.

Stay updated with the latest research and developments in AI and machine learning to ensure innovative and cutting-edge solutions.
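
A minimal sketch of hyperparameter tuning with scikit-learn's GridSearchCV on a synthetic dataset; the model choice and parameter grid are illustrative assumptions rather than the team's actual stack.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

# Synthetic binary-classification data standing in for real project data.
X, y = make_classification(n_samples=500, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Small, illustrative grid; a real search would cover more parameters and values.
param_grid = {"n_estimators": [100, 300], "max_depth": [None, 5, 10]}

search = GridSearchCV(
    RandomForestClassifier(random_state=42),
    param_grid,
    cv=5,
    scoring="roc_auc",
    n_jobs=-1,
)
search.fit(X_train, y_train)

print("Best params:", search.best_params_)
print("Cross-validated ROC-AUC:", round(search.best_score_, 3))
print("Held-out ROC-AUC:", round(search.score(X_test, y_test), 3))
```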


Qualifications & Skills:

Education: PhD or Master's degree in Statistics, Mathematics, Computer Science, or a related field.


Experience:

4+ years of experience in machine learning and deep learning, with expertise in algorithm development and optimization.

Proficiency in SQL, Python, and visualization tools (Power BI).

Experience in developing mathematical models for business applications, preferably in industries such as finance, trading, image-based AI, biomedical modeling, or recommender systems.

Strong communication skills to interact effectively with both technical and non-technical stakeholders.

Excellent problem-solving skills with the ability to work independently and as part of a team.

Read more
Ekloud INC
Ankita G
Posted by Ankita G
Remote only
6 - 9 yrs
₹18L - ₹25L / yr
Salesforce
sales cloud
accelq
Test Automation (QA)
Automated testing
+2 more

Job Title: Salesforce QA Engineer

 

Experience: 6+ Years

 

Location: Bangalore - Hybrid (Manyata Tech Park)

 

Job description:

 

6+ years of hands-on experience with both manual and automated testing, with a strong preference for experience using AccelQ on Salesforce and SAP platforms.

 

Proven expertise in Salesforce particularly within the Sales Cloud module.

 

Proficient in writing complex SOQL and SQL queries for data validation and backend testing.

 

Extensive experience in designing and developing robust, reusable automated test scripts for Salesforce environments.

 

Highly skilled at early issue detection, with a deep understanding of backend configurations, process flows and validation rules.

 

Should have a strong background in Salesforce testing, with hands-on experience in automation tools such as Selenium, Provar, or TestNG. 

You will be responsible for creating and maintaining automated test scripts, executing test cases, identifying bugs, and ensuring the quality and reliability of Salesforce applications. 

 

A solid understanding of Salesforce modules (Sales Cloud, Service Cloud, etc.) and APIs is essential. 

 

Experience with CI/CD tools like Jenkins and version control systems like Git is preferred. 

 

You will work closely with developers, business analysts, and stakeholders to define test strategies and improve the overall QA process. 

Read more
Fountane inc
HR Fountane
Posted by HR Fountane
Remote only
5 - 9 yrs
₹18L - ₹32L / yr
Amazon Web Services (AWS)
AWS Lambda
AWS CloudFormation
ETL
Docker
+3 more

Position Overview: We are looking for an experienced and highly skilled Senior Data Engineer to join our team and help design, implement, and optimize data systems that support high-end analytical solutions for our clients. As a customer-centric Data Engineer, you will work closely with clients to understand their business needs and translate them into robust, scalable, and efficient technical solutions. You will be responsible for end-to-end data modelling, integration workflows, and data transformation processes while ensuring security, privacy, and compliance.

In this role, you will also leverage the latest advancements in artificial intelligence, machine learning, and large language models (LLMs) to deliver high-impact solutions that drive business success. The ideal candidate will have a deep understanding of data infrastructure, optimization techniques, and cost-effective data management.


Key Responsibilities:


• Customer Collaboration:

– Partner with clients to gather and understand their business requirements, translating them into actionable technical specifications.

– Act as the primary technical consultant to guide clients through data challenges and deliver tailored solutions that drive value.


•Data Modeling & Integration:

– Design and implement scalable, efficient, and optimized data models to support business operations and analytical needs.

– Develop and maintain data integration workflows to seamlessly extract, transform, and load (ETL) data from various sources into data repositories.

– Ensure smooth integration between multiple data sources and platforms, including cloud and on-premise systems


• Data Processing & Optimization:

– Develop, optimize, and manage data processing pipelines to enable real-time and batch data processing at scale.

– Continuously evaluate and improve data processing performance, optimizing for throughput while minimizing infrastructure costs.


• Data Governance & Security:

–Implement and enforce data governance policies and best practices, ensuring data security, privacy, and compliance with relevant industry regulations (e.g., GDPR, HIPAA).

–Collaborate with security teams to safeguard sensitive data and maintain privacy controls across data environments.


• Cross-Functional Collaboration:

– Work closely with data engineers, data scientists, and business analysts to ensure that the data architecture aligns with organizational objectives and delivers actionable insights.

– Foster collaboration across teams to streamline data workflows and optimize solution delivery.


• Leveraging Advanced Technologies:

– Utilize AI, machine learning models, and large language models (LLMs) to automate processes, accelerate delivery, and provide smart, data-driven solutions to business challenges.

– Identify opportunities to apply cutting-edge technologies to improve the efficiency, speed, and quality of data processing and analytics.


• Cost Optimization:

–Proactively manage infrastructure and cloud resources to optimize throughput while minimizing operational costs.

–Make data-driven recommendations to reduce infrastructure overhead and increase efficiency without sacrificing performance.


Qualifications:


• Experience:

– Proven experience (5+ years) as a Data Engineer or similar role, designing and implementing data solutions at scale.

– Strong expertise in data modelling, data integration (ETL), and data transformation processes.

– Experience with cloud platforms (AWS, Azure, Google Cloud) and big data technologies (e.g., Hadoop, Spark).


• Technical Skills:

– Advanced proficiency in SQL, data modelling tools (e.g., Erwin, PowerDesigner), and data integration frameworks (e.g., Apache NiFi, Talend).

– Strong understanding of data security protocols, privacy regulations, and compliance requirements.

– Experience with data storage solutions (e.g., data lakes, data warehouses, NoSQL, relational databases).


• AI & Machine Learning Exposure:

– Familiarity with leveraging AI and machine learning technologies (e.g., TensorFlow, PyTorch, scikit-learn) to optimize data processing and analytical tasks.

–Ability to apply advanced algorithms and automation techniques to improve business processes.


• Soft Skills:

– Excellent communication skills to collaborate with clients, stakeholders, and cross-functional teams.

– Strong problem-solving ability with a customer-centric approach to solution design.

– Ability to translate complex technical concepts into clear, understandable terms for non-technical audiences.


• Education:

– Bachelor’s or Master’s degree in Computer Science, Information Systems, Data Science, or a related field (or equivalent practical experience).


LIFE AT FOUNTANE:

  • Fountane offers an environment where all members are supported, challenged, recognized & given opportunities to grow to their fullest potential.
  • Competitive pay
  • Health insurance for spouses, kids, and parents.
  • PF/ESI or equivalent
  • Individual/team bonuses
  • Employee stock ownership plan
  • Fun/challenging variety of projects/industries
  • Flexible workplace policy - remote/physical
  • Flat organization - no micromanagement
  • Individual contribution - set your deadlines
  • Above all - culture that helps you grow exponentially!


A LITTLE BIT ABOUT THE COMPANY:

Established in 2017, Fountane Inc is a Ventures Lab incubating and investing in new competitive technology businesses from scratch. Thus far, we’ve created half a dozen multi-million valuation companies in the US and a handful of sister ventures for large corporations, including Target, US Ventures, and Imprint Engine.

We’re a team of 120+ people from around the world who are radically open-minded, believe in excellence, respect one another, and push our boundaries further than ever before.

Read more
KGISL MICROCOLLEGE
Agency job
via EDU TECH by Srimathi Balamurugan
Remote, Kochi (Cochin)
1 - 5 yrs
₹2L - ₹6L / yr
Business Analysis
SQL
MS-Excel
Tableau
PowerBI

We are looking for a passionate and experienced Business Analyst Trainer to join our training team. This role involves delivering high-quality training programs on business analysis tools, methodologies, and best practices, both in-person and online.

Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Remote only
10 - 15 yrs
₹10L - ₹18L / yr
Solution architecture
Denodo
Data Virtualization
Data architecture
SQL
+5 more

Job Title : Solution Architect – Denodo

Experience : 10+ Years

Location : Remote / Work from Home

Notice Period : Immediate joiners preferred


Job Overview :

We are looking for an experienced Solution Architect – Denodo to lead the design and implementation of data virtualization solutions. In this role, you will work closely with cross-functional teams to ensure our data architecture aligns with strategic business goals. The ideal candidate will bring deep expertise in Denodo, strong technical leadership, and a passion for driving data-driven decisions.


Mandatory Skills : Denodo, Data Virtualization, Data Architecture, SQL, Data Modeling, ETL, Data Integration, Performance Optimization, Communication Skills.


Key Responsibilities :

  • Architect and design scalable data virtualization solutions using Denodo.
  • Collaborate with business analysts and engineering teams to understand requirements and define technical specifications.
  • Ensure adherence to best practices in data governance, performance, and security.
  • Integrate Denodo with diverse data sources and optimize system performance.
  • Mentor and train team members on Denodo platform capabilities.
  • Lead tool evaluations and recommend suitable data integration technologies.
  • Stay updated with emerging trends in data virtualization and integration.

Required Qualifications :

  • Bachelor’s degree in Computer Science, IT, or a related field.
  • 10+ Years of experience in data architecture and integration.
  • Proven expertise in Denodo and data virtualization frameworks.
  • Strong proficiency in SQL and data modeling.
  • Hands-on experience with ETL processes and data integration tools.
  • Excellent communication, presentation, and stakeholder management skills.
  • Ability to lead technical discussions and influence architectural decisions.
  • Denodo or data architecture certifications are a strong plus.
Read more
Remote only
4 - 6 yrs
₹10L - ₹15L / yr
Angular (2+)
.NET
SQL
Relational Database (RDBMS)
Dependency injection

.NET + Angular Full Stack Developer (4–5 Years Experience)

Location: Pune/Remote

Experience Required: 4 to 5 years

Communication: Fluent English (verbal & written)

Technology: .NET, Angular

Only immediate joiners who can start on 21st July should apply.


Job Overview

We are seeking a skilled and experienced Full Stack Developer with strong expertise in .NET (C#) and Angular to join our dynamic team in Pune. The ideal candidate will have hands-on experience across the full development stack, a strong understanding of relational databases and SQL, and the ability to work independently with clients. Experience in microservices architecture is a plus.


Key Responsibilities

  • Design, develop, and maintain modern web applications using .NET Core / .NET Framework and Angular
  • Write clean, scalable, and maintainable code for both backend and frontend components
  • Interact directly with clients for requirement gathering, demos, sprint planning, and issue resolution
  • Work closely with designers, QA, and other developers to ensure high-quality product delivery
  • Perform regular code reviews, ensure adherence to coding standards, and mentor junior developers if needed
  • Troubleshoot and debug application issues and provide timely solutions
  • Participate in discussions on architecture, design patterns, and technical best practices

Must-Have Skills

✅ Strong hands-on experience with .NET Core / .NET Framework (Web API, MVC)

✅ Proficiency in Angular (Component-based architecture, RxJS, State Management)

✅ Solid understanding of RDBMS and SQL (preferably with SQL Server)

✅ Familiarity with Entity Framework or Dapper

✅ Strong knowledge of RESTful API design and integration

✅ Version control using Git

✅ Excellent verbal and written communication skills

✅ Ability to work in a client-facing role and handle discussions independently

Good-to-Have / Optional Skills

Understanding or experience in Microservices Architecture

Exposure to CI/CD pipelines, unit testing frameworks, and cloud environments (e.g., Azure or AWS)

Read more
Wissen Technology

at Wissen Technology

4 recruiters
Shrutika SaileshKumar
Posted by Shrutika SaileshKumar
Remote, Bengaluru (Bangalore)
5 - 9 yrs
Best in industry
Python
SDET
BDD
SQL
Data Warehouse (DWH)
+2 more

Primary skill set: QA Automation, Python, BDD, SQL 

As Senior Data Quality Engineer you will:

  • Evaluate product functionality and create test strategies and test cases to assess product quality.
  • Work closely with the onshore and offshore teams.
  • Validate multiple reports against the databases by running medium to complex SQL queries (see the sketch after this list).
  • Build a solid understanding of automation objects and integrations across various platforms and applications.
  • Contribute individually, exploring opportunities to improve performance and clearly articulating the importance and advantages of proposed improvements to management.
  • Integrate with SCM infrastructure to establish a continuous build and test cycle using CICD tools.
  • Comfortable working on Linux/Windows environment(s) and Hybrid infrastructure models hosted on Cloud platforms.
  • Establish processes and tools set to maintain automation scripts and generate regular test reports.
  • Conduct peer reviews to provide feedback and to make sure the test scripts are flawless.
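
A minimal sketch of the report-versus-database validation idea flagged above, using an in-memory SQLite table as a stand-in for the real warehouse; the table, the report contents, and the assertions are assumptions, and a production suite would target Databricks and run as pytest cases in CI.

```python
import sqlite3
import pandas as pd

def validate_report(report: pd.DataFrame, conn) -> None:
    """Compare an extracted report against the source database (toy example)."""
    # Recompute the aggregation the report claims to contain.
    expected = pd.read_sql_query(
        "SELECT region, SUM(amount) AS total_sales FROM sales GROUP BY region ORDER BY region",
        conn,
    )
    actual = report.sort_values("region").reset_index(drop=True)

    # Assert row counts and totals match; in a real suite these would be pytest cases in CI.
    assert len(actual) == len(expected), "Row count mismatch between report and database"
    pd.testing.assert_frame_equal(actual[["region", "total_sales"]], expected)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript(
        "CREATE TABLE sales (region TEXT, amount REAL);"
        "INSERT INTO sales VALUES ('APAC', 100), ('APAC', 50), ('EMEA', 200);"
    )
    report_df = pd.DataFrame({"region": ["APAC", "EMEA"], "total_sales": [150.0, 200.0]})
    validate_report(report_df, conn)
    print("Report matches the database.")
```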

Core/Must have skills:

  • Excellent understanding of and hands-on experience in ETL/DWH testing, preferably with Databricks, paired with Python experience.
  • Hands on experience SQL (Analytical Functions and complex queries) along with knowledge of using SQL client utilities effectively.
  • Clear & crisp communication and commitment towards deliverables
  • Experience on BigData Testing will be an added advantage.
  • Knowledge on Spark and Scala, Hive/Impala, Python will be an added advantage.

Good to have skills:

  • Test automation using BDD/Cucumber or TestNG, combined with strong hands-on experience in Java with Selenium; working experience with WebdriverIO is especially valued.
  • Ability to effectively articulate technical challenges and solutions
  • Work experience in qTest, Jira, WebDriver.IO


Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Remote only
5 - 10 yrs
₹10L - ₹22L / yr
Business Analysis
Healthcare
Requirements management
User stories
Gap analysis
+11 more

Position : Business Analyst

Experience : 5+ Years

Location : Remote

Notice Period : Immediate Joiners Preferred (or candidates serving 10–15 days’ notice)

Interview Mode : Virtual


Job Description :

We are seeking an experienced Business Analyst with a strong background in requirements gathering, functional documentation, and stakeholder management, particularly in the US Healthcare payer domain.


Mandatory Skills :

Business Analysis, US Healthcare Payer Domain, Requirement Gathering, User Stories, Gap & Impact Analysis, Azure DevOps/TFS, SQL, UML Modeling, SDLC/STLC, System Testing, UAT, Strong Communication Skills.


Key Responsibilities :

  • Analyze and understand complex business and functional requirements.
  • Translate business needs into detailed User Stories, functional and technical specifications.
  • Conduct gap analysis and impact assessment for new and existing product features.
  • Create detailed documentation including scope, project plans, and secure stakeholder approvals.
  • Support System Testing and User Acceptance Testing (UAT) from a functional perspective.
  • Prepare and maintain release notes, end-user documentation, training materials, and process flows.
  • Serve as a liaison between business and technical teams, ensuring cross-functional alignment.
  • Assist with sprint planning, user story tracking, and status updates using Azure DevOps / TFS.
  • Write and execute basic SQL queries for data validation and analysis.

Required Skills :

  • Minimum 5 years of experience as a Business Analyst.
  • Strong analytical, problem-solving, and communication skills.
  • Solid understanding of Project Life Cycle, STLC, and UML modeling.
  • Prior experience in US Healthcare payer domain is mandatory.
  • Familiarity with tools like Azure DevOps / TFS.
  • Ability to work with urgency, manage priorities, and maintain attention to detail.
  • Strong team collaboration and stakeholder management.
Read more
Parksmart
Agency job
via Parksmart by Saurav Kumar
Remote, Noida
0 - 1 yrs
₹10000 - ₹15000 / mo
NodeJS (Node.js)
Amazon Web Services (AWS)
React.js
SQL
MongoDB
+1 more


🚀 We're Urgently Hiring – Node.js Backend Development Intern

Join our backend team as an intern and get hands-on experience building scalable, real-world applications with Node.js, Firebase, and AWS.

📍 Remote / Onsite

📅 Duration: 2 Months


🔧 What You’ll Work On:

Backend development using Node.js

Firebase, SQL & NoSQL database management

RESTful API integration

Deployment on AWS infrastructure


Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Remote, Kochi (Cochin), Trivandrum
8 - 15 yrs
₹10L - ₹24L / yr
Java
Spring Boot
Python
Angular (2+)
Amazon Web Services (AWS)
+7 more

Job Title : Technical Architect

Experience : 8 to 12+ Years

Location : Trivandrum / Kochi / Remote

Work Mode : Remote flexibility available

Notice Period : Immediate to max 15 days (30 days with negotiation possible)


Summary :

We are looking for a highly skilled Technical Architect with expertise in Java Full Stack development, cloud architecture, and modern frontend frameworks (Angular). This is a client-facing, hands-on leadership role, ideal for technologists who enjoy designing scalable, high-performance, cloud-native enterprise solutions.


🛠 Key Responsibilities :

  • Architect scalable and high-performance enterprise applications.
  • Hands-on involvement in system design, development, and deployment.
  • Guide and mentor development teams in architecture and best practices.
  • Collaborate with stakeholders and clients to gather and refine requirements.
  • Evaluate tools, processes, and drive strategic technical decisions.
  • Design microservices-based solutions deployed over cloud platforms (AWS/Azure/GCP).
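
As a loose illustration of the microservices responsibility above, the sketch below shows a single health-check endpoint in Python; Flask, the route, and the port are assumptions made for the example, not a prescribed stack for this role.

```python
# Minimal, assumed sketch of one microservice endpoint. Flask, the /health
# route, and port 8080 are illustrative choices only.
from flask import Flask, jsonify

app = Flask(__name__)


@app.get("/health")
def health():
    """Liveness probe that a cloud platform (AWS/Azure/GCP) can poll."""
    return jsonify(status="ok", service="orders"), 200


if __name__ == "__main__":
    # In a container this would normally sit behind a production WSGI server.
    app.run(host="0.0.0.0", port=8080)
```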

Mandatory Skills :

  • Backend : Java, Spring Boot, Python
  • Frontend : Angular (at least 2 years of recent hands-on experience)
  • Cloud : AWS / Azure / GCP
  • Architecture : Microservices, EAI, MVC, Enterprise Design Patterns
  • Data : SQL / NoSQL, Data Modeling
  • Other : Client handling, team mentoring, strong communication skills

Nice to Have Skills :

  • Mobile technologies (Native / Hybrid / Cross-platform)
  • DevOps & Docker-based deployment
  • Application Security (OWASP, PCI DSS)
  • TOGAF familiarity
  • Test-Driven Development (TDD)
  • Analytics / BI / ML / AI exposure
  • Domain knowledge in Financial Services or Payments
  • 3rd-party integration tools (e.g., MuleSoft, BizTalk)

⚠️ Important Notes :

  • Only candidates from outside Hyderabad/Telangana and non-JNTU graduates will be considered.
  • Candidates must be serving notice or joinable within 30 days.
  • Client-facing experience is mandatory.
  • Java Full Stack candidates are highly preferred.

🧭 Interview Process :

  1. Technical Assessment
  2. Two Rounds – Technical Interviews
  3. Final Round
Read more
Hunarstreet Technologies pvt ltd
Agency job
via Hunarstreet Technologies pvt ltd by Sakshi Patankar
Remote only
10 - 20 yrs
₹15L - ₹30L / yr
Data engineering
databricks
skill iconPython
skill iconScala
Spark
+14 more

What You’ll Be Doing:

● Design and build parts of our data pipeline architecture for extraction, transformation, and loading of data from a wide variety of data sources using the latest Big Data technologies.

● Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

● Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.

● Work with machine learning, data, and analytics experts to drive innovation, accuracy and greater functionality in our data system.

Qualifications:

● Bachelor's degree in Engineering, Computer Science, or relevant field.

● 10+ years of relevant and recent experience in a Data Engineer role.

● 5+ years recent experience with Apache Spark and solid understanding of the fundamentals.

● Deep understanding of Big Data concepts and distributed systems.

● Strong coding skills with Scala, Python, Java and/or other languages and the ability to quickly switch between them with ease.

● Advanced working knowledge of SQL and hands-on experience with a variety of relational databases such as Postgres and/or MySQL.

● Cloud experience with Databricks (an illustrative PySpark sketch follows this list).

● Experience working with data stored in many formats including Delta Tables, Parquet, CSV and JSON.

● Comfortable working in a linux shell environment and writing scripts as needed.

● Comfortable working in an Agile environment

● Machine Learning knowledge is a plus.

● Must be capable of working independently and delivering stable, efficient and reliable software.

● Excellent written and verbal communication skills in English.

● Experience supporting and working with cross-functional teams in a dynamic environment
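
The Spark, Databricks, and file-format points above can be illustrated with a short sketch. The snippet below assumes a Databricks-style environment (Spark 3.1+ with Delta Lake available); all paths, column names, and the target table are hypothetical.

```python
# Hedged sketch of a small extract-transform-load step. Paths, columns, and the
# target table are invented; assumes a cluster where Delta Lake support exists.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: raw CSV plus historical Parquet from assumed landing locations.
raw_orders = spark.read.option("header", True).csv("/mnt/landing/orders/*.csv")
history = spark.read.parquet("/mnt/history/orders/")

# Transform: normalise types and keep only valid rows.
cleaned = (
    raw_orders
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount") > 0)
)

# Load: append the combined data to a Delta table for downstream analytics.
combined = history.unionByName(cleaned, allowMissingColumns=True)
combined.write.format("delta").mode("append").saveAsTable("analytics.orders")
```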


EMPLOYMENT TYPE: Full-Time, Permanent

LOCATION: Remote (Pan India)

SHIFT TIMINGS: 2:00 PM - 11:00 PM IST

Read more
Automate Accounts

at Automate Accounts

2 candid answers
Namrata Das
Posted by Namrata Das
Remote only
4 - 10 yrs
₹10L - ₹25L / yr
skill iconPython
SQL
Artificial Intelligence (AI)
skill iconMachine Learning (ML)
skill iconGitHub
+2 more

Responsibilities

  • Develop and maintain web and backend components using Python, Node.js, and Zoho tools
  • Design and implement custom workflows and automations in Zoho (a hedged sketch follows this list)
  • Perform code reviews to maintain quality standards and best practices
  • Debug and resolve technical issues promptly
  • Collaborate with teams to gather and analyze requirements for effective solutions
  • Write clean, maintainable, and well-documented code
  • Manage and optimize databases to support changing business needs
  • Contribute individually while mentoring and supporting team members
  • Adapt quickly to a fast-paced environment and meet expectations within the first month
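
As referenced in the responsibilities above, the sketch below shows one way a Python backend component might push a record into a Zoho-style REST API. The endpoint, module name, payload shape, and auth header are assumptions for illustration; the actual URLs and OAuth flow should be taken from Zoho's API documentation.

```python
# Hedged, illustrative sketch of a Zoho-style REST integration from Python.
# The base URL, /Leads module, payload shape, and auth header are assumptions.
import os

import requests

ZOHO_API_BASE = os.environ.get("ZOHO_API_BASE", "https://www.zohoapis.com/crm/v2")  # assumed base URL
ZOHO_OAUTH_TOKEN = os.environ["ZOHO_OAUTH_TOKEN"]  # hypothetical env var holding an access token


def create_lead(first_name: str, last_name: str, email: str) -> dict:
    """Create a single lead record via an assumed Zoho-style endpoint."""
    response = requests.post(
        f"{ZOHO_API_BASE}/Leads",
        headers={"Authorization": f"Zoho-oauthtoken {ZOHO_OAUTH_TOKEN}"},
        json={"data": [{"First_Name": first_name, "Last_Name": last_name, "Email": email}]},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    print(create_lead("Asha", "Rao", "asha.rao@example.com"))
```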



Leadership Opportunities

  • Lead and mentor junior developers in the team
  • Drive projects independently while collaborating with the broader team
  • Act as a technical liaison between the team and stakeholders to deliver effective solutions



Selection Process

  1. HR Screening: Review of qualifications and experience
  2. Online Technical Assessment: Test coding and problem-solving skills
  3. Technical Interview: Assess expertise in web development, Python, Node.js, APIs, and Zoho
  4. Leadership Evaluation: Evaluate team collaboration and leadership abilities
  5. Management Interview: Discuss cultural fit and career opportunities
  6. Offer Discussion: Finalize compensation and role specifics



Experience Required

  • 5-7 years of relevant experience as a Software Developer
  • Proven ability to work as a self-starter and contribute individually
  • Strong technical and interpersonal skills to support team members effectively

Read more