
50+ SQL Jobs in India

Apply to 50+ SQL Jobs on CutShort.io. Find your next job, effortlessly. Browse SQL Jobs and apply today!

The Alter Office

Posted by Harsha Ravindran
Bengaluru (Bangalore)
3 - 6 yrs
₹12L - ₹18L / yr
NodeJS (Node.js)
MySQL
NoSQL Databases
MongoDB
Google Cloud Platform (GCP)
+14 more

Role: Senior Software Engineer - Backend

Location: In-Office, Bangalore, Karnataka, India

 

Job Summary:

We are seeking a highly skilled and experienced Senior Backend Engineer with a minimum of 3 years of experience in product building to join our dynamic and innovative team. In this role, you will be responsible for designing, developing, and maintaining robust backend systems that power our applications. You will work closely with cross-functional teams to ensure seamless integration between frontend and backend components, leveraging your expertise to architect scalable, secure, and high-performance solutions. As a senior team member, you will mentor junior developers and lead technical initiatives to drive innovation and excellence.

 

Annual Compensation: 12-18 LPA


Responsibilities:

  • Lead the design, development, and maintenance of scalable and efficient backend systems and APIs.
  • Architect and implement complex backend solutions, ensuring high availability and performance.
  • Collaborate with product managers, frontend developers, and other stakeholders to deliver comprehensive end-to-end solutions.
  • Design and optimize data storage solutions using relational databases and NoSQL databases.
  • Mentor and guide junior developers, fostering a culture of knowledge sharing and continuous improvement.
  • Implement and enforce best practices for code quality, security, and performance optimization.
  • Develop and maintain CI/CD pipelines to automate build, test, and deployment processes.
  • Ensure comprehensive test coverage, including unit testing, and implement various testing methodologies and tools to validate application functionality.
  • Utilize cloud services (e.g., AWS, Azure, GCP) for infrastructure deployment, management, and optimization.
  • Conduct system design reviews and provide technical leadership in architectural discussions.
  • Stay updated with industry trends and emerging technologies to drive innovation within the team.
  • Implement secure authentication and authorization mechanisms and ensure data encryption for sensitive information.
  • Design and develop event-driven applications utilizing serverless computing principles to enhance scalability and efficiency.

Requirements:

  • Minimum of 3 years of proven experience as a Backend Engineer, with a strong portfolio of product-building projects.
  • Strong proficiency in backend development using Java, Python, and JavaScript, with experience in building scalable and high-performance applications.
  • Experience with popular backend frameworks and libraries for Java (e.g., Spring Boot) and Python (e.g., Django, Flask).
  • Strong expertise in SQL and NoSQL databases (e.g., MySQL, MongoDB) with a focus on data modeling and scalability.
  • Practical experience with caching mechanisms (e.g., Redis) to enhance application performance.
  • Proficient in RESTful API design and development, with a strong understanding of API security best practices.
  • In-depth knowledge of asynchronous programming and event-driven architecture.
  • Familiarity with the entire web stack, including protocols, web server optimization techniques, and performance tuning.
  • Experience with containerization and orchestration technologies (e.g., Docker, Kubernetes) is highly desirable.
  • Proven experience working with cloud technologies (AWS/GCP/Azure) and understanding of cloud architecture principles.
  • Strong understanding of fundamental design principles behind scalable applications and microservices architecture.
  • Excellent problem-solving, analytical, and communication skills.
  • Ability to work collaboratively in a fast-paced, agile environment and lead projects to successful completion.
The Alter Office

Posted by Harsha Ravindran
Bengaluru (Bangalore)
1 - 4 yrs
₹6L - ₹10L / yr
NodeJS (Node.js)
MySQL
SQL
MongoDB
Express
+9 more

Job Title: Backend Developer

Location: In-Office, Bangalore, Karnataka, India


Job Summary:

We are seeking a highly skilled and experienced Backend Developer with a minimum of 1 year of experience in product building to join our dynamic and innovative team. In this role, you will be responsible for designing, developing, and maintaining robust backend systems that drive our applications. You will collaborate with cross-functional teams to ensure seamless integration between frontend and backend components, and your expertise will be critical in architecting scalable, secure, and high-performance backend solutions.


Annual Compensation: 6-10 LPA


Responsibilities:

  • Design, develop, and maintain scalable and efficient backend systems and APIs using NodeJS.
  • Architect and implement complex backend solutions, ensuring high availability and performance.
  • Collaborate with product managers, frontend developers, and other stakeholders to deliver comprehensive end-to-end solutions.
  • Design and optimize data storage solutions using relational databases (e.g., MySQL) and NoSQL databases (e.g., MongoDB, Redis).
  • Promote a culture of collaboration, knowledge sharing, and continuous improvement.
  • Implement and enforce best practices for code quality, security, and performance optimization.
  • Develop and maintain CI/CD pipelines to automate build, test, and deployment processes.
  • Ensure comprehensive test coverage, including unit testing, and implement various testing methodologies and tools to validate application functionality.
  • Utilize cloud services (e.g., AWS, Azure, GCP) for infrastructure deployment, management, and optimization.
  • Conduct system design reviews and contribute to architectural discussions.
  • Stay updated with industry trends and emerging technologies to drive innovation within the team.
  • Implement secure authentication and authorization mechanisms and ensure data encryption for sensitive information.
  • Design and develop event-driven applications utilizing serverless computing principles to enhance scalability and efficiency.


Requirements:

  • Minimum of 1 year of proven experience as a Backend Developer, with a strong portfolio of product-building projects.
  • Extensive experience with JavaScript backend frameworks (e.g., Express, Socket) and a deep understanding of their ecosystems.
  • Strong expertise in SQL and NoSQL databases (MySQL and MongoDB) with a focus on data modeling and scalability.
  • Practical experience with Redis and caching mechanisms to enhance application performance.
  • Proficient in RESTful API design and development, with a strong understanding of API security best practices.
  • In-depth knowledge of asynchronous programming and event-driven architecture.
  • Familiarity with the entire web stack, including protocols, web server optimization techniques, and performance tuning.
  • Experience with containerization and orchestration technologies (e.g., Docker, Kubernetes) is highly desirable.
  • Proven experience working with cloud technologies (AWS/GCP/Azure) and understanding of cloud architecture principles.
  • Strong understanding of fundamental design principles behind scalable applications and microservices architecture.
  • Excellent problem-solving, analytical, and communication skills.
  • Ability to work collaboratively in a fast-paced, agile environment and lead projects to successful completion.
TechMynd Consulting

Posted by Suraj N
Bengaluru (Bangalore), Gurugram, Mumbai
4 - 8 yrs
₹10L - ₹24L / yr
Data Science
PostgreSQL
Python
Apache
Amazon Web Services (AWS)
+5 more

Senior Data Engineer


Location: Bangalore, Gurugram (Hybrid)


Experience: 4-8 Years


Type: Full Time | Permanent


Job Summary:


We are looking for a results-driven Senior Data Engineer to join our engineering team. The ideal candidate will have hands-on expertise in data pipeline development, cloud infrastructure, and BI support, with a strong command of modern data stacks. You’ll be responsible for building scalable ETL/ELT workflows, managing data lakes and marts, and enabling seamless data delivery to analytics and business intelligence teams.


This role requires deep technical know-how in PostgreSQL, Python scripting, Apache Airflow, AWS or other cloud environments, and a working knowledge of modern data and BI tools.


Key Responsibilities:


PostgreSQL & Data Modeling

  • Design and optimize complex SQL queries, stored procedures, and indexes
  • Perform performance tuning and query plan analysis
  • Contribute to schema design and data normalization

Data Migration & Transformation

  • Migrate data from multiple sources to cloud or ODS platforms
  • Design schema mapping and implement transformation logic
  • Ensure consistency, integrity, and accuracy in migrated data

Python Scripting for Data Engineering

  • Build automation scripts for data ingestion, cleansing, and transformation
  • Handle file formats (JSON, CSV, XML), REST APIs, cloud SDKs (e.g., Boto3)
  • Maintain reusable script modules for operational pipelines

Data Orchestration with Apache Airflow

  • Develop and manage DAGs for batch/stream workflows
  • Implement retries, task dependencies, notifications, and failure handling
  • Integrate Airflow with cloud services, data lakes, and data warehouses

Cloud Platforms (AWS / Azure / GCP)

  • Manage data storage (S3, GCS, Blob), compute services, and data pipelines
  • Set up permissions, IAM roles, encryption, and logging for security
  • Monitor and optimize cost and performance of cloud-based data operations

Data Marts & Analytics Layer

  • Design and manage data marts using dimensional models
  • Build star/snowflake schemas to support BI and self-serve analytics
  • Enable incremental load strategies and partitioning

Modern Data Stack Integration

  • Work with tools like DBT, Fivetran, Redshift, Snowflake, BigQuery, or Kafka
  • Support modular pipeline design and metadata-driven frameworks
  • Ensure high availability and scalability of the stack

BI & Reporting Tools (Power BI / Superset / Supertech)

  • Collaborate with BI teams to design datasets and optimize queries
  • Support development of dashboards and reporting layers
  • Manage access, data refreshes, and performance for BI tools
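Purely as an illustration of the Airflow orchestration work described above (task dependencies, retries, and failure handling), a minimal DAG might look roughly like the sketch below; the DAG id, callables, and schedule are hypothetical, not part of this role:

```python
# Minimal, hypothetical Airflow DAG sketch: retries, a failure callback, and a task dependency.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders():
    print("extracting raw orders ...")  # placeholder for an ingestion step (e.g., S3 -> staging table)


def transform_orders():
    print("building the curated orders table ...")  # placeholder for a transformation step


def notify_on_failure(context):
    # Failure handling hook: in practice this might page on-call or post to Slack.
    print(f"task {context['task_instance'].task_id} failed")


default_args = {
    "retries": 3,                         # re-run a failed task up to 3 times
    "retry_delay": timedelta(minutes=5),  # wait between attempts
    "on_failure_callback": notify_on_failure,
}

with DAG(
    dag_id="orders_daily_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    transform = PythonOperator(task_id="transform_orders", python_callable=transform_orders)

    extract >> transform  # transform runs only after extract succeeds
```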




Required Skills & Qualifications:

  • 4–6 years of hands-on experience in data engineering roles
  • Strong SQL skills in PostgreSQL (tuning, complex joins, procedures)
  • Advanced Python scripting skills for automation and ETL
  • Proven experience with Apache Airflow (custom DAGs, error handling)
  • Solid understanding of cloud architecture (especially AWS)
  • Experience with data marts and dimensional data modeling
  • Exposure to modern data stack tools (DBT, Kafka, Snowflake, etc.)
  • Familiarity with BI tools like Power BI, Apache Superset, or Supertech BI
  • Version control (Git) and CI/CD pipeline knowledge is a plus
  • Excellent problem-solving and communication skills

Quanteon Solutions
Posted by DurgaPrasad Sannamuri
Remote only
7 - 10 yrs
₹20L - ₹30L / yr
React.js
NodeJS (Node.js)
JavaScript
SQL
MongoDB
+5 more

We are looking for a highly skilled Senior Software Engineer with over 7 years of experience in fullstack development using React.js and Node.js. As a senior member of our engineering team, you’ll take ownership of complex technical challenges, influence architecture decisions, mentor junior developers, and contribute to high-impact products.


Key Responsibilities:

Design, build, and maintain scalable web applications using React.js (frontend) and Node.js (backend).

Architect robust, secure, and scalable backend APIs and frontend components.

Collaborate closely with Product Managers, Designers, and DevOps to deliver end-to-end features.

Conduct code reviews, enforce best practices, and guide junior developers.

Optimize application performance, scalability, and responsiveness.

Troubleshoot, debug, and upgrade existing systems.

Stay current with new technologies and advocate for continuous improvement.


Required Qualifications:

Bachelor’s or Master’s degree in Computer Science, Engineering, or related field.

7+ years of experience in fullstack development.

Strong expertise in React.js and related libraries (Redux, Hooks, etc.).

In-depth experience with Node.js, Express.js, and RESTful APIs.

Proficiency with JavaScript/TypeScript and modern frontend tooling (Webpack, Babel, etc.).

Experience with relational and NoSQL databases (e.g., PostgreSQL, MongoDB).

Solid understanding of CI/CD, testing (Jest, Mocha), and version control (Git).

Familiarity with cloud services (AWS/GCP/Azure) and containerization (Docker, Kubernetes) is a plus.

Excellent communication and problem-solving skills.


Nice to Have:

Experience with microservices architecture.

Knowledge of GraphQL.

Exposure to serverless computing.

Prior experience working in Agile/Scrum teams.

Zazmic Inc

Agency job
Remote only
5 - 8 yrs
₹10L - ₹15L / yr
Artificial Intelligence (AI)
Machine Learning (ML)
Databricks
Python
SQL
+4 more

Title: Data Engineer II (Remote – India/Portugal)

Exp: 4-8 Years

CTC: up to 30 LPA


Required Skills & Experience:

  • 4+ years in data engineering or backend software development
  • Experience with AI/ML is important
  • Expert in SQL and data modeling
  • Strong Python, Java, or Scala coding skills
  • Experience with Snowflake, Databricks, AWS (S3, Lambda)
  • Background in relational and NoSQL databases (e.g., Postgres)
  • Familiar with Linux shell and systems administration
  • Solid grasp of data warehouse concepts and real-time processing
  • Excellent troubleshooting, documentation, and QA mindset


If interested, kindly share your updated CV to 82008 31681

Top tier global IT consulting company

Agency job
via AccioJob by AccioJobHiring Board
Pune, Hyderabad, Gurugram, Chennai
0 - 1 yrs
₹11.1L - ₹11.1L / yr
Data Structures
Algorithms
Object Oriented Programming (OOPs)
SQL
Any programming language

AccioJob is conducting an exclusive diversity hiring drive with a reputed global IT consulting company for female candidates only.


Apply Here: https://links.acciojob.com/3SmQ0Bw


Key Details:

• Role: Application Developer

• CTC: ₹11.1 LPA

• Work Location: Pune, Chennai, Hyderabad, Gurgaon (Onsite)

• Required Skills: DSA, OOPs, SQL, and proficiency in any programming language


Eligibility Criteria:

• Graduation Year: 2024–2025

• Degree: B.E/B.Tech or M.E/M.Tech

• CS/IT branches: No prior experience required

• Non-CS/IT branches: Minimum 6 months of technical experience

• Minimum 60% in UG


Selection Process:

Offline Assessment at AccioJob Skill Center(s) in:

• Pune

• Hyderabad

• Noida

• Delhi

• Greater Noida


Further Rounds for Shortlisted Candidates Only:

• Coding Test

• Code Pairing Round

• Technical Interview

• Leadership Round


Note: Candidates must bring their own laptop & earphones for the assessment.


Apply Here: https://links.acciojob.com/3SmQ0Bw

KG Microcollege
Coimbatore
5 - 8 yrs
₹5L - ₹8L / yr
JavaScript
Python
SQL
HTML/CSS
React.js
+3 more

Job Title: Full stack Developer

Location: Coimbatore

Job Type: Full Time

Experience Level: 5-8 Years


NeoGenCode Technologies Pvt Ltd
Bengaluru (Bangalore)
8 - 15 yrs
₹5L - ₹20L / yr
Java
Spring Boot
Microservices
Kubernetes
Multithreading
+6 more

🔥 High Priority – Senior Lead Java Developer (10+ Years) | Bangalore – Onsite


Summary :

We are hiring Senior Lead Java Developers with 10+ years of experience for an onsite role in Bangalore.

If you're a hands-on expert with a strong background in Java, Spring Boot, Microservices, and Kubernetes, this is your opportunity to lead, mentor, and deliver high-quality solutions in a fast-paced environment.


🔹 Position : Senior Lead Java Developer

🔹 Experience : 10+ Years (12+ preferred)

🔹 Location : Bangalore (Onsite)

🔹 Openings : 6+

Must-Have Skills :

  • 8+ years of hands-on experience with Core Java & Spring Boot
  • Expertise in Multithreading, Dependency Injection, and AOP
  • Strong in Microservices Architecture and RESTful services
  • Good exposure to SQL & NoSQL databases
  • Proficient with Git (GitLab preferred)
  • Experience with Kubernetes deployments and APM tools (New Relic preferred)
  • Solid understanding of distributed tracing and log analysis
  • Proven debugging and performance optimization skills

💼 Responsibilities :

  • Design and develop high-quality, scalable microservices
  • Act as SME for multiple services or subsystems
  • Own service performance, SLAs, and incident resolutions
  • Mentor junior developers and conduct technical interviews
  • Participate in production war rooms and troubleshooting
  • Lead development efforts and drive code quality

🎓 Qualification :

  • BE/B.Tech or equivalent degree
Wissen Technology
Posted by Vijayalakshmi Selvaraj
Bengaluru (Bangalore)
8 - 12 yrs
₹12L - ₹25L / yr
ETL
SQL
Snowflake schema
  • 8-10 years of experience in ETL testing, Snowflake, and DWH concepts.
  • Strong SQL knowledge and debugging skills are a must.
  • Experience in Azure and Snowflake testing is a plus.
  • Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus.
  • Strong data warehousing concepts; experience with ETL tools like Talend Cloud Data Integration and Pentaho/Kettle.
  • Experience with JIRA and the Xray defect management tool is good to have.
  • Exposure to financial domain knowledge is considered a plus.
  • Test data readiness (data quality) and address code or data issues.
  • Demonstrated ability to rationalize problems and use judgment and innovation to define clear and concise solutions.
  • Demonstrated strong collaborative experience across regions (APAC, EMEA and NA) to effectively and efficiently identify root causes of code/data issues and come up with permanent solutions.
  • Prior experience with State Street and Charles River Development (CRD) is considered a plus.
  • Experience with tools such as PowerPoint, Excel, and SQL.
  • Exposure to third-party data providers such as Bloomberg, Reuters, MSCI and other rating agencies is a plus.


Leading HealthTech, a U.S.-based product company

Agency job
via Recruiting Bond by Pavan Kumar
Bengaluru (Bangalore), Mumbai
9 - 13 yrs
₹35L - ₹45L / yr
Java
J2EE
WebLogic
Spring
Apache Camel
+18 more

🚀 We're Hiring: Technical Lead – Java Backend & Integration

📍 Bangalore | Hybrid | Full-Time

👨‍💻 9+ Years Experience | Enterprise Product Development

🏥 Healthcare Tech | U.S. Health Insurance Domain

Join Leading HealthTech, a U.S.-based product company driving innovation in the $1.1 trillion health insurance industry. We power over 81 million lives, with 130+ customers and 100+ third-party integrations. At our growing Bangalore tech hub, you’ll solve real-world, large-scale problems and help modernize one of the most stable and impactful industries in the world.


🔧 What You'll Work On:

  • Architect and build backend & integration solutions using Java, J2EE, WebLogic, Spring, Apache Camel
  • Transition monolith systems to microservices-based architecture
  • Lead design reviews, customer discussions, code quality, UAT & production readiness
  • Work with high-volume transactional systems processing millions of health claims daily
  • Coach & mentor engineers, contribute to platform modernization


🧠 What You Bring:

  • 9+ years in backend Java development and enterprise system integration
  • Hands-on with REST, SOAP, JMS, SQL, stored procedures, XML, ESBs
  • Solid understanding of SOA, data structures, system design, and performance tuning
  • Experience with Agile, CI/CD, unit testing, and code quality tools
  • Healthcare/payor domain experience is a huge plus!


💡 Why this opportunity?

  • Global product impact from our India technology center
  • Work on mission-critical systems in a stable and recession-resilient sector
  • Be part of a journey to modernize healthcare through tech
  • Solve complex challenges at scale that few companies offer

🎯 Ready to drive change at the intersection of tech and healthcare?

Prismberry Technologies Pvt Ltd
Posted by Reshika Mendiratta
Noida
3yrs+
Upto ₹18L / yr (Varies)
Python
Airflow
Apache Airflow
pandas
pytest
+2 more

About the Role:

We are looking for a proactive and independent Data Engineer to join our team and take ownership of automating data loading processes into our databases. This role is critical in establishing and enforcing strict data standards, with the new hire playing a key role in defining and building out these processes. The ideal candidate will be hands-on, able to work independently, and trusted to write clean, maintainable code.


Key Responsibilities:

  • Automate data loading processes into our databases using task automation tools (e.g., Prefect, Airflow).
  • Define, implement, and enforce data standards and validation processes.
  • Scrape data from various sources, investigate data issues, and ensure clean, properly formatted data is ingested.
  • Write reliable, well-tested Python code, including using frameworks like Pandas and pytest.
  • Collaborate with the team to improve data pipelines and troubleshoot issues as needed.


Required Skills:

  • Strong Python programming skills.
  • Experience with task automation tools such as Prefect or Airflow.
  • Hands-on experience with Pandas and testing frameworks like pytest.
  • Strong SQL skills with the ability to work across complex datasets.
  • Experience in data scraping and cleaning, with attention to data quality and integrity.


Nice to Have:

  • Familiarity with Pydantic, FastAPI, and mocking libraries.
  • Experience building and optimizing APIs or related services.


Who You Are:

  • A mid-level engineer who can work independently and take full ownership of tasks.
  • Detail-oriented with a strong sense of responsibility toward data accuracy and quality.
  • Comfortable writing clean, tested code and contributing to process improvements.
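As a small, hedged illustration of the data-standards work this role describes (validating a frame before it is loaded, with the rule pinned down by a pytest test), here is a sketch; the column names and rules are invented for the example:

```python
# Hypothetical sketch: enforce a simple data standard with pandas and cover it with a pytest test.
import pandas as pd
import pytest

REQUIRED_COLUMNS = {"id", "email", "created_at"}


def validate_frame(df: pd.DataFrame) -> pd.DataFrame:
    """Reject frames that break the (invented) standard: required columns, no null ids, valid dates."""
    missing = REQUIRED_COLUMNS - set(df.columns)
    if missing:
        raise ValueError(f"missing columns: {sorted(missing)}")
    if df["id"].isna().any():
        raise ValueError("null values in 'id'")
    df = df.copy()
    df["created_at"] = pd.to_datetime(df["created_at"], errors="raise")
    return df


def test_validate_frame_rejects_null_ids():
    bad = pd.DataFrame({
        "id": [1, None],
        "email": ["a@example.com", "b@example.com"],
        "created_at": ["2024-01-01", "2024-01-02"],
    })
    with pytest.raises(ValueError):
        validate_frame(bad)
```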
Remote only
2 - 5 yrs
₹5L - ₹8L / yr
Python
NumPy
PyTorch
pandas
Data Visualization
+5 more

Python Developer

We are looking for an enthusiastic and skilled Python Developer with a passion for AI-based application development to join our growing technology team. This position offers the opportunity to work at the intersection of software engineering and data analytics, contributing to innovative AI-driven solutions that drive business impact. If you have a strong foundation in Python, a flair for problem-solving, and an eagerness to build intelligent systems, we would love to meet you!

Key Responsibilities

• Develop and deploy AI-focused applications using Python and associated frameworks.

• Collaborate with Developers, Product Owners, and Business Analysts to design and implement machine learning pipelines.

• Create interactive dashboards and data visualizations for actionable insights.

• Automate data collection, transformation, and processing tasks.

• Utilize SQL for data extraction, manipulation, and database management.

• Apply statistical methods and algorithms to derive insights from large datasets.

Required Skills and Qualifications

• 2–3 years of experience as a Python Developer, with a strong portfolio of relevant projects.

• Bachelor’s degree in Computer Science, Data Science, or a related technical field.

• In-depth knowledge of Python, including frameworks and libraries such as NumPy, Pandas, SciPy, and PyTorch.

• Proficiency in front-end technologies like HTML, CSS, and JavaScript.

• Familiarity with SQL and NoSQL databases and their best practices.

• Excellent communication and team-building skills.

• Strong problem-solving abilities with a focus on innovation and self-learning.

• Knowledge of cloud platforms such as AWS is a plus.

Additional Requirements

This opportunity enhances your work-life balance with an allowance for remote work.


To be successful your computer hardware and internet must meet these minimum requirements:

1. Laptop or Desktop:

• Operating System: Windows
• Screen Size: 14 inches
• Screen Resolution: FHD (1920×1080)
• Processor: i5 or higher
• RAM: Minimum 8GB (must)
• Type: Windows Laptop
• Software: AnyDesk
• Internet Speed: 100 Mbps or higher


About ARDEM

ARDEM is a leading Business Process Outsourcing and Business Process Automation service provider. For over twenty years ARDEM has successfully delivered business process outsourcing and business process automation services to our clients in the USA and Canada. We are growing rapidly. We are constantly innovating to become a better service provider for our customers, and we continuously strive for excellence to become the best Business Process Outsourcing and Business Process Automation company.

NonStop io Technologies Pvt Ltd
Posted by Kalyani Wadnere
Pune
4 - 8 yrs
Best in industry
.NET
ASP.NET
C#
Entity Framework
LINQ
+6 more

About NonStop io Technologies:

NonStop io Technologies is a value-driven company with a strong focus on process-oriented software engineering. We specialize in Product Development and have a decade's worth of experience in building web and mobile applications across various domains. NonStop io Technologies follows core principles that guide its operations and believes in staying invested in a product's vision for the long term. We are a small but proud group of individuals who believe in the 'givers gain' philosophy and strive to provide value in order to seek value. We are committed to and specialize in building cutting-edge technology products and serving as trusted technology partners for startups and enterprises. We pride ourselves on fostering innovation, learning, and community engagement. Join us to work on impactful projects in a collaborative and vibrant environment.


Brief Description:

NonStop io is seeking a proficient .NET Developer to join our growing team. You will be responsible for developing, enhancing, and maintaining scalable applications using .NET technologies. This role involves working on a healthcare-focused product and requires strong problem-solving skills, attention to detail, and a passion for software development.


Responsibilities:

  • Design, develop, and maintain applications using .NET Core/.NET Framework, C#, and related technologies
  • Write clean, scalable, and efficient code while following best practices
  • Develop and optimize APIs and microservices
  • Work with SQL Server and other databases to ensure high performance and reliability
  • Collaborate with cross-functional teams, including UI/UX designers, QA, and DevOps
  • Participate in code reviews and provide constructive feedback
  • Troubleshoot, debug, and enhance existing applications
  • Ensure compliance with security and performance standards, especially for healthcare-related applications


Qualifications & Skills:

  • Strong experience in .NET Core/.NET Framework and C#
  • Proficiency in building RESTful APIs and microservices architecture
  • Experience with Entity Framework, LINQ, and SQL Server
  • Familiarity with front-end technologies like React, Angular, or Blazor is a plus
  • Knowledge of cloud services (Azure/AWS) is a plus
  • Experience with version control (Git) and CI/CD pipelines
  • Strong understanding of object-oriented programming (OOP) and design patterns
  • Prior experience in healthcare tech or working with HIPAA-compliant systems is a plus


Why Join Us?

  • Opportunity to work on a cutting-edge healthcare product
  • A collaborative and learning-driven environment
  • Exposure to AI and software engineering innovations
  • Excellent work ethics and culture

If you're passionate about technology and want to work on impactful projects, we'd love to hear from you!

Trellissoft Inc.
Posted by Nikita Sinha
Bengaluru (Bangalore)
5yrs+
Upto ₹18L / yr (Varies)
Python
Flask
Microservices
SQL
NoSQL Databases

Job Responsibilities:

  • Design, develop, test, and maintain high-performance web applications and backend services using Python.
  • Build scalable, secure, and reliable backend systems and APIs.
  • Optimize and debug existing codebases to enhance performance and maintainability.
  • Collaborate closely with cross-functional teams to gather requirements and deliver high-quality solutions.
  • Mentor junior developers, conduct code reviews, and uphold best coding practices.
  • Write clear, comprehensive technical documentation for internal and external use.
  • Stay current with emerging technologies, tools, and industry trends to continually improve development processes.

Qualifications:

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • 5+ years of hands-on experience in Python development.
  • Strong expertise in Flask.
  • In-depth understanding of software design principles, architecture, and design patterns.
  • Proven experience working with both SQL and NoSQL databases.
  • Solid debugging and problem-solving capabilities.
  • Effective communication and collaboration skills, with a team-first mindset.

Technical Skills:

  • Programming: Python (Advanced)
  • Web Frameworks: Flask
  • Databases: PostgreSQL, MySQL, MongoDB, Redis
  • Version Control: Git
  • API Development: RESTful APIs
  • Containerization & Orchestration: Docker, Kubernetes
  • Cloud Platforms: AWS or Azure (hands-on experience preferred)
  • DevOps: CI/CD pipelines (e.g., Jenkins, GitHub Actions)
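For orientation only, a minimal Flask-style endpoint of the kind this stack implies might look like the sketch below; the route, payload fields, and in-memory store are assumptions for illustration, not details of the actual codebase:

```python
# Minimal, hypothetical Flask REST sketch: an in-memory store standing in for a real database.
from flask import Flask, jsonify, request

app = Flask(__name__)
items = {}  # item_id -> item dict; a real service would use PostgreSQL/MongoDB instead


@app.post("/items")
def create_item():
    payload = request.get_json(force=True)
    item_id = len(items) + 1
    items[item_id] = {"id": item_id, "name": payload.get("name", "")}
    return jsonify(items[item_id]), 201


@app.get("/items/<int:item_id>")
def get_item(item_id):
    item = items.get(item_id)
    if item is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(item)


if __name__ == "__main__":
    app.run(debug=True)  # development server only
```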


Deqode
Posted by Roshni Maji
Remote only
5 - 7 yrs
₹12L - ₹16L / yr
Python
Google Cloud Platform (GCP)
SQL
PySpark
Data Transformation Tool (DBT)
+2 more

Role: GCP Data Engineer

Notice Period: Immediate Joiners

Experience: 5+ years

Location: Remote

Company: Deqode


About Deqode

At Deqode, we work with next-gen technologies to help businesses solve complex data challenges. Our collaborative teams build reliable, scalable systems that power smarter decisions and real-time analytics.


Key Responsibilities

  • Build and maintain scalable, automated data pipelines using Python, PySpark, and SQL.
  • Work on cloud-native data infrastructure using Google Cloud Platform (BigQuery, Cloud Storage, Dataflow).
  • Implement clean, reusable transformations using DBT and Databricks.
  • Design and schedule workflows using Apache Airflow.
  • Collaborate with data scientists and analysts to ensure downstream data usability.
  • Optimize pipelines and systems for performance and cost-efficiency.
  • Follow best software engineering practices: version control, unit testing, code reviews, CI/CD.
  • Manage and troubleshoot data workflows in Linux environments.
  • Apply data governance and access control via Unity Catalog or similar tools.


Required Skills & Experience

  • Strong hands-on experience with PySpark, Spark SQL, and Databricks.
  • Solid understanding of GCP services (BigQuery, Cloud Functions, Dataflow, Cloud Storage).
  • Proficiency in Python for scripting and automation.
  • Expertise in SQL and data modeling.
  • Experience with DBT for data transformations.
  • Working knowledge of Airflow for workflow orchestration.
  • Comfortable with Linux-based systems for deployment and troubleshooting.
  • Familiar with Git for version control and collaborative development.
  • Understanding of data pipeline optimization, monitoring, and debugging.
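As a rough sketch of the pipeline work listed above (PySpark plus Spark SQL-style transformations over cloud storage), one step might look like this; the bucket paths and column names are hypothetical:

```python
# Hypothetical PySpark step: read raw events, filter and aggregate, write a partitioned curated table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_events_rollup").getOrCreate()

events = (
    spark.read.json("gs://example-raw-bucket/events/2024-01-01/")  # placeholder GCS path
    .where(F.col("event_type").isNotNull())
)

daily_counts = (
    events
    .groupBy("event_type", F.to_date("event_ts").alias("event_date"))
    .agg(F.count("*").alias("events"), F.countDistinct("user_id").alias("users"))
)

(
    daily_counts.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("gs://example-curated-bucket/daily_event_counts/")  # placeholder output path
)

spark.stop()
```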
Builds holistic technology solutions for the entertainment and leisure industry. We help you automate key processes and manage them centrally. Our hardware and software solutions are designed to create delightful experiences for your customers, while also making your business more robust and your staff more productive.

Agency job
via HyrHub by Shwetha Naik
Bengaluru (Bangalore), Mangalore
4 - 6 yrs
₹8L - ₹10L / yr
Windows Presentation Foundation (WPF)
User Experience (UX) Design
C#
.NET
SQL
+3 more

1. 4 - 7 years working as a professional WPF UI developer.

2. Proficient knowledge of WPF and .NET C#.

3. Proficient understanding of UX design principles and creating responsive layouts.

4. Good understanding of SQL and REST APIs

5. Excellent analytical and multitasking skills.

Quanteon Solutions
Posted by DurgaPrasad Sannamuri
Hyderabad
3 - 5 yrs
₹8L - ₹20L / yr
SQL
PostgreSQL
Databases
ETL
Data Visualization
+8 more

We’re looking for an experienced SQL Developer with 3+ years of hands-on experience to join our growing team. In this role, you’ll be responsible for designing, developing, and maintaining SQL queries, procedures, and data systems that support our business operations and decision-making processes. You should be passionate about data, highly analytical, and capable of working both independently and collaboratively with cross-functional teams.


Key Responsibilities:


Design, develop, and maintain complex SQL queries, stored procedures, functions, and views.

Optimize existing queries for performance and efficiency.

Collaborate with data analysts, developers, and stakeholders to understand requirements and translate them into robust SQL solutions.

Design and implement ETL processes to move and transform data between systems.

Perform data validation, troubleshooting, and quality checks.

Maintain and improve existing databases, ensuring data integrity, security, and accessibility.

Document code, processes, and data models to support scalability and maintainability.

Monitor database performance and provide recommendations for improvement.

Work with BI tools and support dashboard/report development as needed.


Requirements:

3+ years of proven experience as an SQL Developer or in a similar role.

Strong knowledge of SQL and relational database systems (e.g., MS SQL Server, PostgreSQL, MySQL, Oracle).

Experience with performance tuning and optimization.

Proficiency in writing complex queries and working with large datasets.

Experience with ETL tools and data pipeline creation.

Familiarity with data warehousing concepts and BI reporting.

Solid understanding of database security, backup, and recovery.

Excellent problem-solving skills and attention to detail.

Good communication skills and ability to work in a team environment.


Nice to Have:


Experience with cloud-based databases (AWS RDS, Google BigQuery, Azure SQL).

Knowledge of Python, Power BI, or other scripting/analytics tools.

Experience working in Agile or Scrum environments.
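Purely as an illustration of the query development and tuning work described above, a hedged sketch using psycopg2 against a hypothetical PostgreSQL orders table might look like this (the DSN, table, and columns are invented):

```python
# Hypothetical sketch: run an aggregate report query and inspect its plan with EXPLAIN.
import psycopg2

conn = psycopg2.connect("dbname=example user=example password=example host=localhost")  # placeholder DSN

query = """
    SELECT customer_id, SUM(amount) AS total_spend
    FROM orders
    WHERE order_date >= %s
    GROUP BY customer_id
    ORDER BY total_spend DESC
    LIMIT 10;
"""

with conn, conn.cursor() as cur:
    # EXPLAIN shows the planner's strategy; an index on order_date is the usual first tuning step here.
    cur.execute("EXPLAIN " + query, ("2024-01-01",))
    for (plan_line,) in cur.fetchall():
        print(plan_line)

    cur.execute(query, ("2024-01-01",))
    for customer_id, total_spend in cur.fetchall():
        print(customer_id, total_spend)

conn.close()
```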

Ketto
Posted by Sagar Ganatra
Mumbai
1 - 3 yrs
₹10L - ₹15L / yr
Tableau
PowerBI
SQL
Python
Dashboard
+5 more

About the company:


Ketto is Asia's largest tech-enabled crowdfunding platform with a vision - Healthcare for all. We are a profit-making organization with a valuation of more than 100 Million USD. With over 1,100 crores raised from more than 60 lakh donors, we have positively impacted the lives of 2 lakh+ campaigners. Ketto has embarked on a high-growth journey, and we would like you to be part of our family, helping us to create a large-scale impact on a daily basis by taking our product to the next level



Role Overview:


Ketto, Asia's largest crowdfunding platform, is looking for an innovative Product Analyst to take charge of our data systems, reporting frameworks, and generative AI initiatives. This role is pivotal in ensuring data integrity and reliability, driving key insights that fuel strategic decisions, and implementing automation through AI. This position encompasses the full data and analytics lifecycle—from requirements gathering to design planning—alongside implementing advanced analytics and generative AI solutions to support Ketto’s mission.


Key Responsibilities


●  Data Strategy & Automation:

○ Lead data collection, processing, and quality assurance processes to ensure accuracy, completeness, and relevance.

○ Explore opportunities to incorporate generative AI models to automate and optimize processes, enhancing efficiencies in analytics, reporting, and decision-making.


●  Data Analysis & Insight Generation:

○ Conduct in-depth analyses of user behaviour, campaign performance, and platform metrics to uncover insights that support crowdfunding success.

○ Translate complex data into clear, actionable insights that drive strategic decisions, providing stakeholders with the necessary information to enhance business outcomes.


●  Reporting & Quality Assurance:

○ Design and maintain a robust reporting framework to deliver timely insights, enhancing data reliability and ensuring stakeholders are well-informed.

○ Monitor and improve data accuracy, consistency, and integrity across all data processes, identifying and addressing areas for enhancement.


●  Collaboration & Strategic Planning:

○ Work closely with Business, Product, and IT teams to align data initiatives with Ketto’s objectives and growth strategy.

○ Propose data-driven strategies that leverage AI and automation to tackle business challenges and scale impact across the platform.

○ Mentor junior data scientists and analysts, fostering a culture of data-driven decision-making.


Required Skills and Qualifications


●  Technical Expertise:

○ Strong background in SQL, Statistics and Maths


●  Analytical & Strategic Mindset:

○ Proven ability to derive meaningful, actionable insights from large data sets and translate findings into business strategies.

○ Experience with statistical analysis, advanced analytics


●  Communication & Collaboration:

○ Exceptional written and verbal communication skills, capable of explaining complex data insights to non-technical stakeholders.

○ Strong interpersonal skills to work effectively with cross-functional teams, aligning data initiatives with organisational goals.


●  Preferred Experience:

○ Proven experience in advanced analytics roles

○ Experience leading data lifecycle management, model deployment, and quality assurance initiatives.


Why Join Ketto?

At Ketto, we’re committed to driving social change through innovative data and AI solutions. As our Sr. Product Analyst, you’ll have the unique opportunity to leverage advanced data science and generative AI to shape the future of crowdfunding in Asia. If you’re passionate about using data and AI for social good, we’d love to hear from you!

Hunarstreet Technologies Pvt Ltd

Agency job
Ahmedabad
3 - 8 yrs
₹6L - ₹8L / yr
NodeJS (Node.js)
NoSQL Databases
SQL
RESTful APIs
PostgreSQL
+7 more

Designation – Node.js Developer

Experience – Min 3+ Yrs


Location: Ahmedabad ( WFO)

We are seeking a highly skilled Senior Node.js Developer with expertise in SQL and MongoDB to join our dynamic team.

As a key member of our development team, you will be responsible for managing the interchange of data between the server and users, as well as developing server-side logic. Your primary focus will be on the development of all server-side logic, ensuring high performance and responsiveness to requests from the front-end. Additionally, your experience with both SQL and NoSQL databases will be crucial in defining and maintaining our data storage solutions.

Responsibilities:

  • Develop and maintain server-side applications using Node.js.
  • Design and implement RESTful APIs for seamless integration with front-end applications.
  • Collaborate with front-end developers to integrate user-facing elements with server-side logic.
  • Optimize applications for maximum speed and scalability.
  • Implement security and data protection measures.
  • Design and maintain database schemas for both SQL (e.g., MySQL, PostgreSQL) and NoSQL (e.g., MongoDB) databases.
  • Manage and mentor junior developers, providing technical guidance and support.
  • Stay updated with emerging technologies and industry best practices.


Requirements:

  • Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent work experience.
  • Minimum 4 years of experience in Node.js development.
  • Proficiency in JavaScript/TypeScript and frameworks such as Express.js.
  • Strong understanding of asynchronous programming and event-driven architecture.
  • Experience with SQL databases (e.g., MySQL, PostgreSQL) and proficiency in writing complex SQL queries.
  • Experience with NoSQL databases (e.g., MongoDB) and familiarity with their query languages.
  • Familiarity with ORM libraries (e.g., Sequelize, Mongoose) for database interaction.
  • Knowledge of version control systems (e.g., Git).
  • Understanding of CI/CD pipelines and deployment processes.
  • Excellent communication and teamwork skills.
  • Ability to lead and mentor a team of developers.

VyTCDC
Posted by Gobinath Sundaram
Mumbai
1.5 - 1.8 yrs
₹1L - ₹6L / yr
Production support
Linux/Unix
SQL

A Production Support Engineer ensures the smooth operation of software applications and IT systems in a production environment. Here’s a breakdown of the role:

Key Responsibilities

  • Monitoring System Performance: Continuously track application health and resolve performance issues.
  • Incident Management: Diagnose and fix software failures, collaborating with developers and system administrators.
  • Troubleshooting & Debugging: Analyze logs, use diagnostic tools, and implement solutions to improve system reliability.
  • Documentation & Reporting: Maintain records of system issues, resolutions, and process improvements.
  • Collaboration: Work with cross-functional teams to enhance system efficiency and reduce downtime.
  • Process Optimization: Suggest improvements to reduce production costs and enhance system stability.

Required Skills

  • Strong knowledge of SQL, UNIX/Linux, Java, Oracle, and Splunk.
  • Experience in incident management and debugging.
  • Ability to analyze system failures and optimize performance.
  • Good communication and problem-solving skills.
Wissen Technology
Posted by Vijayalakshmi Selvaraj
Bengaluru (Bangalore)
1 - 3 yrs
₹5L - ₹17L / yr
Python
SQL
ETL
Google Cloud Platform (GCP)
Amazon Web Services (AWS)

Job Summary:

We are looking for a motivated and detail-oriented Data Engineer with 1–2 years of experience to join our data engineering team. The ideal candidate should have solid foundational skills in SQL and Python, along with exposure to building or maintaining data pipelines. You’ll play a key role in helping to ingest, process, and transform data to support various business and analytical needs.

Key Responsibilities:

  • Assist in the design, development, and maintenance of scalable and efficient data pipelines.
  • Write clean, maintainable, and performance-optimized SQL queries.
  • Develop data transformation scripts and automation using Python.
  • Support data ingestion processes from various internal and external sources.
  • Monitor data pipeline performance and help troubleshoot issues.
  • Collaborate with data analysts, data scientists, and other engineers to ensure data quality and consistency.
  • Work with cloud-based data solutions and tools (e.g., AWS, Azure, GCP – as applicable).
  • Document technical processes and pipeline architecture.

Core Skills Required:

  • Proficiency in SQL (data querying, joins, aggregations, performance tuning).
  • Experience with Python, especially in the context of data manipulation (e.g., pandas, NumPy).
  • Exposure to ETL/ELT pipelines and data workflow orchestration tools (e.g., Airflow, Prefect, Luigi – preferred).
  • Understanding of relational databases and data warehouse concepts.
  • Familiarity with version control systems like Git.

Preferred Qualifications:

  • Experience with cloud data services (AWS S3, Redshift, Azure Data Lake, etc.)
  • Familiarity with data modeling and data integration concepts.
  • Basic knowledge of CI/CD practices for data pipelines.
  • Bachelor’s degree in Computer Science, Engineering, or related field.
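As a small, hypothetical example of the SQL-plus-Python pipeline step such a role involves (extract with SQL, transform with pandas, load a reporting table), under invented table and column names:

```python
# Hypothetical extract-transform-load step: SQL extract, pandas transform, small reporting table load.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://user:pass@localhost:5432/example")  # placeholder URL

# Extract: pull only the columns the downstream report needs.
raw = pd.read_sql("SELECT user_id, country, signup_ts FROM signups WHERE signup_ts >= '2024-01-01'", engine)

# Transform: derive a signup month and aggregate distinct users per country.
raw["signup_month"] = pd.to_datetime(raw["signup_ts"]).dt.to_period("M").astype(str)
monthly = (
    raw.groupby(["country", "signup_month"], as_index=False)
       .agg(signups=("user_id", "nunique"))
)

# Load: replace the small reporting table on each run.
monthly.to_sql("monthly_signups_by_country", engine, if_exists="replace", index=False)
print(monthly.head())
```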


Remote, Delhi, Gurugram, Noida, Ghaziabad, Faridabad
2 - 5 yrs
₹3L - ₹9L / yr
SQL
XML
JSON
TDL

Job Description:

As a Tally Developer, your main responsibility will be to develop custom solutions in Tally using TDL as per the customer requirements. You will work closely with clients, business analysts, Senior developers, and other stakeholders to understand their requirements and translate them into effective Tally-based solutions.

Responsibilities:

Collaborate with business analysts and the senior developer/project manager to gather and analyse client requirements.

Design, develop, and customize Tally-based software solutions to meet the specific requirements of clients.

Write efficient and well-documented code in Tally Definition Language (TDL) to extend the functionality of Tally software.

Follow the Software Development Life Cycle including requirements gathering, design, coding, testing, and deployment.

Troubleshoot and debug issues related to Tally customization, data import/export, and software integrations.

Provide technical support and assistance to clients and end-users in utilizing and troubleshooting Tally-based software solutions.

Stay updated with the latest features and updates in Tally software to leverage new functionalities in solution development.

Adhere to coding standards, documentation practices, and quality assurance processes.

Requirements:

Any Degree. Relevant work experience may be considered in place of a degree.

Experience in Tally development and customization for projects using Tally Definition Language (TDL).

Hands-on experience in Tally and implementation of its features.

Familiarity with database systems, data structures, and SQL for efficient data management and retrieval.

Strong problem-solving skills and attention to detail.

Good communication and teamwork abilities.

Continuous learning mindset to keep up with advancements in Tally software and related technologies.

Key Skills Required:

TDL (Tally Definition Language), Tally, Excel, XML/JSON.

Good to have Basic Skills:

Databases like MS SQL and MySQL

API Integration.

WORK EXPERIENCE- MINIMUM 2 YEARS AND MAXIMUM 7 YEARS

Interested candidates may WhatsApp their CV on TRIPLE NINE ZERO NINE THREE DOUBLE ONE DOUBLE FOUR.


Please answer the questions below:

Do you have knowledge of Tally Definition Language?

How much experience do you have with TDL?

Intain Technologies
Chennai
2 - 5 yrs
₹9L - ₹14L / yr
Python
Generative AI
Data Analytics
Natural Language Processing (NLP)
SQL
+2 more

About Us

Intain is building a blockchain-based servicing platform for structured finance, backed by top VCs and already managing $5B+ in transactions. We're a 40-member team across India, Singapore & NYC, and 50% of our team are women. We blend AI + blockchain to solve real problems in capital markets.


🧠 What You’ll Work On

  • Build full-stack AI-driven fintech applications from scratch
  • Design scalable microservices & integrate APIs with external systems (banking, RPA tools like Blue Prism/UI Path)
  • Use GenAI tools (ChatGPT-4, Gemini) to solve real NLP use cases
  • Drive cloud-native development on Azure, CI/CD, and DevSecOps workflows
  • Collaborate with cross-functional teams in a flat, Agile environment

🛠️ Skills We're Looking For

  • Frontend: React.js
  • Backend: Python (Flask preferred), REST APIs
  • AI/NLP: ChatGPT / Gemini / GenAI tools
  • DBs: PostgreSQL / MySQL, MongoDB
  • Tools: RabbitMQ, Git, Jenkins, Docker
  • Cloud: Azure (preferred)
  • Testing: Jest / Cypress
  • Agile, startup-ready mindset

🌟 Bonus Points

  • Angular, Redis, Elasticsearch
  • UI/UX knowledge
  • Security & accessibility best practices

🚀 Why Join Us?

  • Work on cutting-edge AI & blockchain tech
  • Flat team, fast decisions, global clients
  • Remote flexibility + strong team culture
  • Competitive compensation


MyOperator - VoiceTree Technologies
Posted by Vijay Muthu
Noida
1 - 2 yrs
₹6L - ₹8L / yr
SQL
Amazon Web Services (AWS)
AWS CloudFormation
Python
Quicksight
+2 more

Job Description:

We are seeking a highly analytical and detail-oriented Data Analyst to join our team. The ideal candidate will have strong problem-solving skills, proficiency in SQL and AWS QuickSight, and a passion for extracting meaningful insights from data. You will be responsible for analyzing complex datasets, building reports and dashboards, and providing data-driven recommendations to support business decisions.


Key Responsibilities:

  • Extract, transform, and analyze data from multiple sources to generate actionable insights.
  • Develop interactive dashboards and reports in AWS QuickSight to visualize trends and key metrics.
  • Write optimized SQL queries to retrieve and manipulate data efficiently.
  • Collaborate with stakeholders to understand business requirements and provide analytical solutions.
  • Identify patterns, trends, and statistical correlations in data to support strategic decision-making.
  • Ensure data integrity, accuracy, and consistency across reports.
  • Continuously explore new tools, techniques, and methodologies to enhance analytical capabilities.

Qualifications & Skills:

  • Strong proficiency in SQL for querying and data manipulation.
  • Hands-on experience with AWS QuickSight for data visualization and reporting.
  • Strong analytical thinking and problem-solving skills with the ability to interpret complex data.
  • Experience working with large datasets and relational databases.
  • Passion for slicing and dicing data to uncover key insights.
  • Exceptional communication skills to effectively understand business requirements and present insights.
  • A growth mindset with a strong attitude for continuous learning and improvement.

Preferred Qualifications:

  • Experience with Python is a plus.
  • Familiarity with cloud-based data environments (AWS etc).
  • Familiarity with leveraging existing LLMs/AI tools to enhance productivity, automate repetitive tasks, and improve analysis efficiency.


Pattem Digital Technologies
Posted by Sanchari Sharma
Remote only
5 - 10 yrs
₹10L - ₹20L / yr
SQL
Adobe Campaign Classic tool

The Consultant / Senior Consultant – Adobe Campaign is a technical role that involves providing consulting advice and support to clients for implementing the Adobe Campaign solution, along with any technical advisory required afterwards. This is a client-facing role and requires the consultant to liaise with the client, understand their technical and business requirements, and then implement the Adobe Campaign solution in a manner that gives the client the most value from it. The consultant's main objective is to drive successful delivery and maintain a high level of satisfaction for our customers.

What you need to succeed

• Expertise and experience in SQL (Oracle / SQL Server / PostgreSQL)

• Programming experience (Javascript / Java / VB / C# / PHP)

• Knowledge on Web Technologies like HTML, CSS would be a plus

• Good communication skills to ensure effective customer interactions, communications, and documentation

• Self-starter - Organized and highly motivated

• Fast learner, ability to learn new technologies/languages

• Knowledge of HTML DOM manipulation and page load events a plus

• Project Management skills a plus

• Ability to develop creative solutions to problems

• Able to multi-task in a dynamic environment

• Able to work independently with minimal supervision

• Experience leading team members will be a plus

Adobe is an equal opportunity/affirmative action employer. We welcome and encourage diversity in the workplace.


ZeMoSo Technologies
Agency job
via TIGI HR Solution Pvt. Ltd. by Vaidehi Sarkar
Mumbai, Bengaluru (Bangalore), Hyderabad, Chennai, Pune
4 - 8 yrs
₹10L - ₹15L / yr
Data engineering
Python
SQL
Data Warehouse (DWH)
Amazon Web Services (AWS)
+3 more

Work Mode: Hybrid


Need B.Tech, BE, M.Tech, ME candidates - Mandatory



Must-Have Skills:

● Educational Qualification :- B.Tech, BE, M.Tech, ME in any field.

● Minimum of 3 years of proven experience as a Data Engineer.

● Strong proficiency in Python programming language and SQL.

● Experience in DataBricks and setting up and managing data pipelines, data warehouses/lakes.

● Good comprehension and critical thinking skills.


● Kindly note that the salary bracket will vary according to the candidate's experience:

- Experience from 4 yrs to 6 yrs - Salary upto 22 LPA

- Experience from 5 yrs to 8 yrs - Salary upto 30 LPA

- Experience more than 8 yrs - Salary upto 40 LPA

Deltek
Posted by shwetha V
Remote only
8 - 12 yrs
Best in industry
Python
.NET
Apache Airflow
React.js
JavaScript
+4 more

Title - Principal Software Engineer

Company Summary :

As the recognized global standard for project-based businesses, Deltek delivers software and information solutions to help organizations achieve their purpose. Our market leadership stems from the work of our diverse employees who are united by a passion for learning, growing and making a difference. At Deltek, we take immense pride in creating a balanced, values-driven environment, where every employee feels included and empowered to do their best work. Our employees put our core values into action daily, creating a one-of-a-kind culture that has been recognized globally. Thanks to our incredible team, Deltek has been named one of America's Best Midsize Employers by Forbes, a Best Place to Work by Glassdoor, a Top Workplace by The Washington Post and a Best Place to Work in Asia by World HRD Congress. www.deltek.com

Business Summary :

The Deltek Engineering and Technology team builds best-in-class solutions to delight customers and meet their business needs. We are laser-focused on software design, development, innovation and quality. Our team of experts has the talent, skills and values to deliver products and services that are easy to use, reliable, sustainable and competitive. If you're looking for a safe environment where ideas are welcome, growth is supported and questions are encouraged – consider joining us as we explore the limitless opportunities of the software industry.

Principal Software Engineer

Position Responsibilities :

  • Develop and manage integrations with third-party services and APIs using industry-standard protocols like OAuth2 for secure authentication and authorization.
  • Develop scalable, performant APIs for Deltek products
  • Accountability for the successful implementation of the requirements by the team.
  • Troubleshoot, debug, and optimize code and workflows for better performance and scalability.
  • Undertake analysis, design, coding and testing activities of complex modules
  • Support the company’s development processes and development guidelines including code reviews, coding style and unit testing requirements.
  • Participate in code reviews and provide mentorship to junior developers.
  • Stay up-to-date with emerging technologies and best practices in Python development, AWS, and frontend frameworks like React, and suggest optimizations based on them
  • Adopt industry best practices in all your projects - TDD, CI/CD, Infrastructure as Code, linting
  • Pragmatic enough to deliver an MVP, but aspirational enough to think about how it will work with millions of users and adapt to new challenges
  • Readiness to hit the ground running – you may not know how to solve everything right off the bat, but you will put in the time and effort to understand so that you can design architecture of complex features with multiple components.

Qualifications :

  • A college degree in Computer Science, Software Engineering, Information Science or a related field is required 
  • Minimum 8-10 years of experience. Sound programming skills in Python, the .NET platform (VB & C#), TypeScript/JavaScript, frontend technologies like React.js/Ember.js, and SQL databases (e.g., PostgreSQL)
  • Experience in backend development and Apache Airflow (or equivalent framework).
  • Build APIs and optimize SQL queries with performance considerations.
  • Experience with Agile Development
  • Experience in writing and maintaining unit tests and using testing frameworks is desirable
  • Exposure to Amazon Web Services (AWS) technologies, Terraform, Docker is a plus
  • Strong desire to continually improve knowledge and skills through personal development activities and apply their knowledge and skills to continuous software improvement.
  • The ability to work under tight deadlines, tolerate ambiguity and work effectively in an environment with multiple competing priorities.
  • Strong problem-solving and debugging skills.
  • Ability to work in an Agile environment and collaborate with cross-functional teams.
  • Familiarity with version control systems like Git.
  • Excellent communication skills and the ability to work effectively in a remote or hybrid team setting.

Read more
Deltek
shwetha V
Posted by shwetha V
Remote only
4 - 7 yrs
Best in industry
Python
.NET
Java
Apache Airflow
TypeScript
+6 more

Title - Sr Software Engineer

Company Summary :


As the recognized global standard for project-based businesses, Deltek delivers software and information solutions to help organizations achieve their purpose. Our market leadership stems from the work of our diverse employees who are united by a passion for learning, growing and making a difference. At Deltek, we take immense pride in creating a balanced, values-driven environment, where every employee feels included and empowered to do their best work. Our employees put our core values into action daily, creating a one-of-a-kind culture that has been recognized globally. Thanks to our incredible team, Deltek has been named one of America's Best Midsize Employers by Forbes, a Best Place to Work by Glassdoor, a Top Workplace by The Washington Post and a Best Place to Work in Asia by World HRD Congress. www.deltek.com


Business Summary :


The Deltek Engineering and Technology team builds best-in-class solutions to delight customers and meet their business needs. We are laser-focused on software design, development, innovation and quality. Our team of experts has the talent, skills and values to deliver products and services that are easy to use, reliable, sustainable and competitive. If you're looking for a safe environment where ideas are welcome, growth is supported and questions are encouraged – consider joining us as we explore the limitless opportunities of the software industry.

External Job Title :


Sr Software Engineer

Position Responsibilities :


  • Develop and manage integrations with third-party services and APIs using industry-standard protocols like OAuth2 for secure authentication and authorization.
  • Develop scalable, performant APIs for Deltek products
  • Accountability for the successful implementation of the requirements by the team.
  • Troubleshoot, debug, and optimize code and workflows for better performance and scalability.
  • Undertake analysis, design, coding and testing activities of complex modules
  • Support the company’s development processes and development guidelines including code reviews, coding style and unit testing requirements.
  • Participate in code reviews and provide mentorship to junior developers.
  • Stay up-to-date with emerging technologies and best practices in Python development, AWS, and frontend frameworks like React.
  • Adopt industry best practices in all your projects - TDD, CI/CD, Infrastructure as Code, linting
  • Pragmatic enough to deliver an MVP, but aspirational enough to think about how it will work with millions of users and adapt to new challenges
  • Readiness to hit the ground running – you may not know how to solve everything right off the bat, but you will put in the time and effort to understand so that you can design architecture of complex features with multiple components.


Qualifications :


  • A college degree in Computer Science, Software Engineering, Information Science or a related field is required 
  • Minimum 4-6 years of experience. Sound programming skills in Python, the .NET platform (VB & C#), TypeScript/JavaScript, frontend technologies like React.js/Ember.js, and SQL databases (e.g., PostgreSQL)
  • Experience in backend development and Apache Airflow (or equivalent framework).
  • Build APIs and optimize SQL queries with performance considerations.
  • Experience with Agile Development
  • Experience in writing and maintaining unit tests and using testing frameworks is desirable
  • Exposure to Amazon Web Services (AWS) technologies, Terraform, Docker is a plus
  • Strong desire to continually improve knowledge and skills through personal development activities and apply their knowledge and skills to continuous software improvement.
  • The ability to work under tight deadlines, tolerate ambiguity and work effectively in an environment with multiple competing priorities.
  • Strong problem-solving and debugging skills.
  • Ability to work in an Agile environment and collaborate with cross-functional teams.
  • Familiarity with version control systems like Git.
  • Excellent communication skills and the ability to work effectively in a remote or hybrid team setting.
Read more
Data Axle

at Data Axle

2 candid answers
Eman Khan
Posted by Eman Khan
Pune
7 - 10 yrs
Best in industry
Google Cloud Platform (GCP)
ETL
Python
Java
Scala
+4 more

About Data Axle:

Data Axle Inc.  has been an industry leader in data, marketing solutions, sales and research for over 45 years in the USA. Data Axle has set up a strategic global center of excellence in Pune. This center delivers mission critical data services to its global customers powered by its proprietary cloud-based technology platform and by leveraging proprietary business & consumer databases.  Data Axle is headquartered in Dallas, TX, USA.


Roles and Responsibilities:

  • Design, implement, and manage scalable analytical data infrastructure, enabling efficient access to large datasets and high-performance computing on Google Cloud Platform (GCP).
  • Develop and optimize data pipelines using GCP-native services like BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Data Fusion, and Cloud Storage (a minimal sketch of this kind of pipeline work follows this list).
  • Work with diverse data sources to extract, transform, and load data into enterprise-grade data lakes and warehouses, ensuring high availability and reliability.
  • Implement and maintain real-time data streaming solutions using Pub/Sub, Dataflow, and Kafka.
  • Research and integrate the latest big data and visualization technologies to enhance analytics capabilities and improve efficiency.
  • Collaborate with cross-functional teams to implement machine learning models and AI-driven analytics solutions using Vertex AI and BigQuery ML.
  • Continuously improve existing data architectures to support scalability, performance optimization, and cost efficiency.
  • Enhance data security and governance by implementing industry best practices for access control, encryption, and compliance.
  • Automate and optimize data workflows to simplify reporting, dashboarding, and self-service analytics using Looker and Data Studio.
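For illustration only, here is a minimal Python sketch of the kind of BigQuery pipeline step described above: loading a CSV file from Cloud Storage and running an aggregation query. The project, bucket, dataset, and table names are hypothetical placeholders, and credentials are assumed to come from the environment (application-default credentials).

```python
# Illustrative only: project, bucket, dataset, and table names below are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # uses application-default credentials

# Load a CSV file from Cloud Storage into a BigQuery table.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,  # infer the schema from the file
)
load_job = client.load_table_from_uri(
    "gs://example-bucket/raw/orders.csv",
    "example-project.analytics.orders",
    job_config=job_config,
)
load_job.result()  # block until the load job finishes

# Run a simple aggregation over the freshly loaded table.
query = """
    SELECT customer_id, SUM(amount) AS total_amount
    FROM `example-project.analytics.orders`
    GROUP BY customer_id
    ORDER BY total_amount DESC
    LIMIT 10
"""
for row in client.query(query).result():
    print(row.customer_id, row.total_amount)
```

In a production pipeline a step like this would typically be orchestrated by Cloud Composer or Dataflow rather than run as a standalone script.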


Basic Qualifications

  • 7+ years of experience in data engineering, software development, business intelligence, or data science, with expertise in large-scale data processing and analytics.
  • Strong proficiency in SQL and experience with BigQuery for data warehousing.
  • Hands-on experience in designing and developing ETL/ELT pipelines using GCP services (Cloud Composer, Dataflow, Dataproc, Data Fusion, or Apache Airflow).
  • Expertise in distributed computing and big data processing frameworks, such as Apache Spark, Hadoop, or Flink, particularly within Dataproc and Dataflow environments.
  • Experience with business intelligence and data visualization tools, such as Looker, Tableau, or Power BI.
  • Knowledge of data governance, security best practices, and compliance requirements in cloud environments.


Preferred Qualifications:

  • Degree/Diploma in Computer Science, Engineering, Mathematics, or a related technical field.
  • Experience working with GCP big data technologies, including BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud SQL.
  • Hands-on experience with real-time data processing frameworks, including Kafka and Apache Beam.
  • Proficiency in Python, Java, or Scala for data engineering and pipeline development.
  • Familiarity with DevOps best practices, CI/CD pipelines, Terraform, and infrastructure-as-code for managing GCP resources.
  • Experience integrating AI/ML models into data workflows, leveraging BigQuery ML, Vertex AI, or TensorFlow.
  • Understanding of Agile methodologies, software development life cycle (SDLC), and cloud cost optimization strategies.
Read more
Data Axle

at Data Axle

2 candid answers
Eman Khan
Posted by Eman Khan
Pune
9 - 12 yrs
Best in industry
Databricks
Python
PySpark
Machine Learning (ML)
SQL
+1 more

Roles & Responsibilities:  

We are looking for a Data Scientist to join the Data Science Client Services team to continue our success of identifying high quality target audiences that generate profitable marketing return for our clients. We are looking for experienced data science, machine learning and MLOps practitioners to design, build and deploy impactful predictive marketing solutions that serve a wide range of verticals and clients. The right candidate will enjoy contributing to and learning from a highly talented team and working on a variety of projects.  


We are looking for a Lead Data Scientist who will be responsible for  

  • Ownership of design, implementation, and deployment of machine learning algorithms in a modern Python-based cloud architecture  
  • Design or enhance ML workflows for data ingestion, model design, model inference and scoring (a brief illustrative sketch follows this list)
  • Oversight on team project execution and delivery
  • Establish peer review guidelines for high quality coding to help develop junior team members’ skill set growth, cross-training, and team efficiencies  
  • Visualize and publish model performance results and insights to internal and external audiences  
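As a rough, illustrative sketch of the model training and scoring work referenced above, the snippet below trains an XGBoost classifier on synthetic data and reports a holdout AUC; the features, labels, and hyperparameters are stand-ins, not a real client model.

```python
# Illustrative training/scoring sketch; the data here is synthetic, not real marketing data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
from xgboost import XGBClassifier

rng = np.random.default_rng(42)
X = rng.normal(size=(5000, 20))                                          # stand-in feature matrix
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=5000) > 0).astype(int)    # stand-in response flag

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1, eval_metric="auc")
model.fit(X_train, y_train)

scores = model.predict_proba(X_test)[:, 1]   # propensity-style scores for the holdout set
print("holdout AUC:", round(roc_auc_score(y_test, scores), 3))
```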


Qualifications:  

  • Masters in a relevant quantitative, applied field (Statistics, Econometrics, Computer Science, Mathematics, Engineering)  
  • Minimum of 9+ years of work experience in the end-to-end lifecycle of ML model development and deployment into production within a cloud infrastructure (Databricks is highly preferred)
  • Exhibit deep knowledge of core mathematical principles relating to data science and machine learning (ML Theory + Best Practices, Feature Engineering and Selection, Supervised and Unsupervised ML, A/B Testing, etc.)  
  • Proficiency in Python and SQL required; PySpark/Spark experience a plus  
  • Ability to conduct a productive peer review and maintain proper code structure in GitHub
  • Proven experience developing, testing, and deploying various ML algorithms (neural networks, XGBoost, Bayes, and the like)  
  • Working knowledge of modern CI/CD methods  


This position description is intended to describe the duties most frequently performed by an individual in this position. It is not intended to be a complete list of assigned duties but to describe a position level. 

Read more
KNS Technologies
Bengaluru (Bangalore)
2 - 3 yrs
₹3.5L - ₹5L / yr
C#
ASP.NET
ADO.NET
ASP.NET MVC
SQL
+6 more

Job Summary: 

We are looking for a talented Full Stack Developer with experience in C# (ASP.NET Core Web API), React, and SQL Server. The successful candidate will be responsible for designing, developing, and maintaining robust web applications and APIs, ensuring seamless integration between the front-end and back-end systems. 

Key Responsibilities: 

Full Stack Development: Design, develop, and maintain web applications using C#, ASP.NET Core Web API, and React. 

API Development: Create and maintain RESTful APIs to support front-end applications and integrations. 

Database Management: Design, optimize, and manage SQL Server databases, including writing complex queries, stored procedures, and indexing. 

Front-End Development: Implement user interfaces using React, ensuring a smooth and responsive user experience. 

Code Quality: Write clean, scalable, and well-documented code following best practices in software development. 

Collaboration: Work closely with cross-functional teams, including UI/UX designers, back-end developers, and DevOps, to deliver high-quality software solutions. 

Testing & Debugging: Conduct unit testing, integration testing, and debugging to ensure the quality and reliability of applications. 

Continuous Improvement: Stay updated on the latest industry trends and technologies and integrate them into development processes where applicable. 

Required Qualifications: 

Experience: Proven experience as a Full Stack Developer with a strong focus on C#, ASP.NET Core Web API, React, and SQL Server. 

Technical Skills: 

Proficient in C# and ASP.NET Core Web API development. 

Strong experience with React and related front-end technologies (JavaScript, HTML, CSS). 

Expertise in SQL Server, including database design, query optimization, and performance tuning. 

Familiarity with version control systems like Git. 

Understanding of RESTful architecture and Web API design. 

Problem-Solving: Excellent analytical and problem-solving skills with the ability to troubleshoot complex issues. 

Communication: Strong verbal and written communication skills, with the ability to articulate technical concepts to non-technical stakeholders. 

Team Collaboration: Ability to work effectively in a team environment, collaborating with cross-functional teams to achieve project goals. 

Preferred Qualifications: 

Experience with ASP.NET Core MVC or Blazor. 

Knowledge of cloud platforms such as Azure or AWS. 

Experience with Agile/Scrum development methodologies. 

Education: 

Bachelor’s degree in Computer Science, Software Engineering, or a related field (or equivalent experience)

Read more
Cymetrix Software

at Cymetrix Software

2 candid answers
Netra Shettigar
Posted by Netra Shettigar
Remote only
3 - 9 yrs
₹12L - ₹24L / yr
Looker
LookML
BigQuery
SQL
Google Cloud Platform (GCP)

Proficient in Looker Actions, Looker dashboarding, Looker data entry, LookML, SQL queries, BigQuery, Looker Studio, and GCP.



Remote Working

2 pm to 12 am IST or

10:30 AM to 7:30 PM IST

Sunday to Thursday



Responsibilities:

● Create and maintain LookML code, which defines data models, dimensions, measures, and relationships within Looker.

● Develop reusable LookML components to ensure consistency and efficiency in report and dashboard creation.

● Build and customize dashboards, incorporating data visualizations such as charts and graphs to present insights effectively.

● Write complex SQL queries when necessary to extract and manipulate data from underlying databases and also optimize SQL queries for performance.

● Connect Looker to various data sources, including databases, data warehouses, and external APIs.

● Identify and address bottlenecks that affect report and dashboard loading times, and optimize Looker performance by tuning queries, caching strategies, and exploring indexing options.

● Configure user roles and permissions within Looker to control access to sensitive data, and implement data security best practices, including row-level and field-level security.

● Develop custom applications or scripts that interact with Looker's API for automation and integration with other tools and systems.

● Use version control systems (e.g., Git) to manage LookML code changes and collaborate with other developers.

● Provide training and support to business users, helping them navigate and use Looker effectively.

● Diagnose and resolve technical issues related to Looker, data models, and reports.


Skills Required:

● Experience in Looker's modeling language, LookML, including data models, dimensions, and measures.

● Strong SQL skills for writing and optimizing database queries across different SQL databases (GCP/BQ preferable)

● Knowledge of data modeling best practices

● Proficient in BigQuery, billing data analysis, GCP billing, unit costing, and invoicing, with the ability to recommend cost optimization strategies.

● Previous experience in FinOps engagements is a plus

● Proficiency in ETL processes for data transformation and preparation.

● Ability to create effective data visualizations and reports using Looker’s dashboard tools.

● Ability to optimize Looker performance by fine-tuning queries, caching strategies, and indexing.

● Familiarity with related tools and technologies, such as data warehousing (e.g., BigQuery ), data transformation tools (e.g., Apache Spark), and scripting languages (e.g., Python).

Read more
Remote only
3 - 5 yrs
₹4L - ₹7L / yr
NetSuite
SOAP
SuiteScript
SuiteScript2.0
ODBC
+1 more

Job Summary:

SiGa Systems is looking for a skilled and motivated Software Developer with expertise in NetSuite API and ODBC integrations. The ideal candidate will design, develop, and maintain robust data integration solutions to seamlessly move data between NetSuite and external database systems. This role demands a deep understanding of NetSuite's data model, SuiteTalk APIs, ODBC connectivity, and strong programming skills for data manipulation and integration.

Key Responsibilities:

1. NetSuite API Development
  • Design and implement custom integrations using NetSuite SuiteTalk REST and SOAP APIs.
  • Develop efficient, scalable scripts using SuiteScript 1.0 and 2.x.
  • Build and maintain Suitelets, Scheduled Scripts, User Event Scripts, and other custom NetSuite components.
  • Troubleshoot and resolve issues related to NetSuite API connections and data workflows.

2. ODBC Data Integration (a minimal connection sketch follows this list)
  • Set up and manage ODBC connections for accessing NetSuite data.
  • Write complex SQL queries and stored procedures for ETL (Extract, Transform, Load) processes.
  • Design and execute data synchronization workflows between NetSuite and external databases (e.g., SQL Server, MySQL, PostgreSQL).
  • Ensure optimal performance and data accuracy across systems.

3. Data Modeling & Database Management
  • Analyze NetSuite data models and design efficient schemas for target systems.
  • Perform data mapping, transformation, and migration tasks.
  • Ensure data consistency and integrity throughout integration pipelines.
  • Monitor database performance and maintain system reliability.

4. Software Development & Documentation
  • Write clean, maintainable, and well-documented code.
  • Participate in code reviews and contribute to coding best practices.
  • Maintain technical documentation, including API specs, integration flows, and data mapping docs.
  • Use version control systems (e.g., Git) for collaboration and code management.

5. Collaboration & Communication
  • Work closely with business analysts, project managers, and cross-functional teams to understand integration requirements.
  • Provide technical guidance and regular progress updates to stakeholders.
  • Participate actively in Agile development processes and contribute to sprint planning and retrospectives.
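For illustration, here is a minimal Python sketch of pulling NetSuite data over ODBC, assuming a DSN has already been configured with the SuiteAnalytics Connect driver; the DSN name, credentials, and table/column names are placeholders rather than a documented NetSuite schema.

```python
# Illustrative only: the DSN, credentials, and table/column names are placeholders.
import pyodbc

conn = pyodbc.connect("DSN=NetSuite;UID=integration_user;PWD=example_password", autocommit=True)
cursor = conn.cursor()

# Pull a recent slice of records for downstream synchronization.
cursor.execute(
    """
    SELECT id, last_modified_date, status
    FROM example_transactions
    WHERE last_modified_date >= ?
    """,
    ("2024-01-01",),
)

for row in cursor.fetchall():
    print(row.id, row.last_modified_date, row.status)

cursor.close()
conn.close()
```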


Read more
Peenak Business solutions
Gaurav Kaushik
Posted by Gaurav Kaushik
Bengaluru (Bangalore)
4 - 6 yrs
₹25L - ₹32L / yr
Python
NodeJS (Node.js)
Go Programming (Golang)
SQL
NOSQL Databases

Exp: 4-6 years

Position: Backend Engineer

Job Location: Bangalore (office near Cubbon Park, opposite JW Marriott)

Work Mode: 5 days work from office


Requirements:

● Engineering graduate with 3-5 years of experience in software product development.

● Proficient in Python, Node.js, Go

● Good knowledge of SQL and NoSQL

● Strong Experience in designing and building APIs

● Experience with working on scalable interactive web applications

● A clear understanding of software design constructs and their implementation

● Understanding of the threading limitations of Python and multi-process architecture

● Experience implementing Unit and Integration testing

● Exposure to the Finance domain is preferred

● Strong written and oral communication skills

Read more
Trellissoft Inc.

at Trellissoft Inc.

3 candid answers
Nikita Sinha
Posted by Nikita Sinha
Bengaluru (Bangalore)
6 - 9 yrs
Up to ₹25L / yr (varies)
Data Warehouse (DWH)
ETL
ELT
SQL
Amazon Web Services (AWS)
+4 more

We’re looking for an experienced Senior Data Engineer to lead the design and development of scalable data solutions at our company. The ideal candidate will have extensive hands-on experience in data warehousing, ETL/ELT architecture, and cloud platforms like AWS, Azure, or GCP. You will work closely with both technical and business teams, mentoring engineers while driving data quality, security, and performance optimization.


Responsibilities:

  • Lead the design of data warehouses, lakes, and ETL workflows.
  • Collaborate with teams to gather requirements and build scalable solutions.
  • Ensure data governance, security, and optimal performance of systems.
  • Mentor junior engineers and drive end-to-end project delivery.

Requirements:

  • 6+ years of experience in data engineering, including at least 2 full-cycle data warehouse projects.
  • Strong skills in SQL, ETL tools (e.g., Pentaho, dbt), and cloud platforms.
  • Expertise in big data tools (e.g., Apache Spark, Kafka).
  • Excellent communication skills and leadership abilities.

Preferred: Experience with workflow orchestration tools (e.g., Airflow), real-time data, and DataOps practices.

Read more
Premier global software products and services firm

Agency job
via Recruiting Bond by Pavan Kumar
Hyderabad, Indore, Ahmedabad
9 - 15 yrs
₹25L - ₹38L / yr
Java
J2EE
Microservices
Apache Kafka
SQL
+17 more

We are in search of a proficient Java Principal Engineer with a minimum of 10 years' experience in designing and developing Java applications. The ideal candidate will demonstrate a deep understanding of Java technologies, including Java EE, Spring Framework, and Hibernate. Proficiency in database technologies such as MySQL, Oracle, or PostgreSQL is essential, along with a proven track record of delivering high-quality, scalable, and efficient Java solutions.



We are looking for you!

You are a team player and a get-it-done person: intellectually curious, customer-focused, self-motivated, and responsible, able to work under pressure with a positive attitude. You have the zeal to think differently, understand that a career is a journey, and make the right choices. You must have experience in creating visually compelling designs that effectively communicate our message and engage our target audience. The ideal candidate is creative, proactive, a go-getter, and motivated to look for ways to add value to their job accomplishments.


As an ideal candidate for the Java Lead position, you bring a wealth of experience and expertise in Java development, combined with strong leadership qualities. Your proven track record showcases your ability to lead and mentor teams to deliver high-quality, enterprise-grade applications.


Your technical proficiency and commitment to excellence make you a valuable asset in driving innovation and success within our development projects. You possess a team-oriented mindset and a "get-it-done" attitude, inspiring your team members to excel and collaborate effectively. 


You have a proven ability to lead mid to large size teams, emphasizing a quality-first approach and ensuring that projects are delivered on time and within scope. As a Java Lead, you are responsible for overseeing project planning, implementing best practices, and driving technical solutions that align with business objectives.


You collaborate closely with development managers, architects, and cross-functional teams to design scalable and robust Java applications.

Your proactive nature and methodical approach enable you to identify areas for improvement, mentor team members, and foster a culture of continuous learning and growth.


Your leadership style, technical acumen, and dedication to delivering excellence make you an ideal candidate to lead our Java development initiatives and contribute significantly to the success and innovation of our organization.


What You Will Do: 

  • Design and development of RESTful Web Services.  
  • Hands on database experience (Oracle / PostgreSQL / MySQL /SQL Server).  
  • Hands on experience with developing web applications leveraging Spring Framework.  
  • Hands on experience with developing microservices leveraging Spring Boot.  
  • Experience with cloud platforms (e.g., AWS, Azure) and containerization technologies.   
  • Continuous Integration tools (Jenkins & GitLab) and CI/CD tools.
  • Strong believer and follower of agile methodologies with an emphasis on Quality & Standards based development. 
  • Architect, design, and implement complex software systems using relevant technologies (e.g., Java, Python, Node.js).


What we need?

  • BTech computer science or equivalent  
  • Minimum 10+ years of relevant experience in Java/J2EE technologies  
  • Experience in building back in API using Spring Boot Framework, Spring DI, Spring AOP  
  • Real time messaging integration using Kafka or similar framework  
  • Experience in at least one database: Oracle, SQL server or PostgreSQL
  • Previous experience managing and leading high-performing software engineering teams.   


Why join us?

  • Work with a passionate and innovative team in a fast-paced, growth-oriented environment.
  • Gain hands-on experience in content marketing with exposure to real-world projects.
  • Opportunity to learn from experienced professionals and enhance your marketing skills.
  • Contribute to exciting initiatives and make an impact from day one.
  • Competitive stipend and potential for growth within the company.
  • Recognized for excellence in data and AI solutions with industry awards and accolades.


Read more
Premier global software products and services firm

Agency job
via Recruiting Bond by Pavan Kumar
Hyderabad, Ahmedabad, Indore
5 - 10 yrs
₹10L - ₹20L / yr
Data engineering
Data modeling
Database Design
Data Warehouse (DWH)
Datawarehousing
+9 more

Job Summary: 

As a Data Engineering Lead, your role will involve designing, developing, and implementing interactive dashboards and reports using data engineering tools. You will work closely with stakeholders to gather requirements and translate them into effective data visualizations that provide valuable insights. Additionally, you will be responsible for extracting, transforming, and loading data from multiple sources into Power BI, ensuring its accuracy and integrity. Your expertise in Power BI and data analytics will contribute to informed decision-making and support the organization in driving data-centric strategies and initiatives.


We are looking for you!

As an ideal candidate for the Data Engineering Lead position, you embody the qualities of a team player with a relentless get-it-done attitude. Your intellectual curiosity and customer focus drive you to continuously seek new ways to add value to your job accomplishments.


You thrive under pressure, maintaining a positive attitude and understanding that your career is a journey. You are willing to make the right choices to support your growth. In addition to your excellent communication skills, both written and verbal, you have a proven ability to create visually compelling designs using tools like Power BI and Tableau that effectively communicate our core values. 


You build high-performing, scalable, enterprise-grade applications and teams. Your creativity and proactive nature enable you to think differently, find innovative solutions, deliver high-quality outputs, and ensure customers remain referenceable. With over eight years of experience in data engineering, you possess a strong sense of self-motivation and take ownership of your responsibilities. You prefer to work independently with little to no supervision. 


You are process-oriented, adopt a methodical approach, and demonstrate a quality-first mindset. You have led mid to large-size teams and accounts, consistently using constructive feedback mechanisms to improve productivity, accountability, and performance within the team. Your track record showcases your results-driven approach, as you have consistently delivered successful projects with customer case studies published on public platforms. Overall, you possess a unique combination of skills, qualities, and experiences that make you an ideal fit to lead our data engineering team(s).


You value inclusivity and want to join a culture that empowers you to show up as your authentic self. 


You know that success hinges on commitment, our differences make us stronger, and the finish line is always sweeter when the whole team crosses together. In your role, you should be driving the team using data, data, and more data. You will manage multiple teams, oversee agile stories and their statuses, handle escalations and mitigations, plan ahead, identify hiring needs, collaborate with recruitment teams for hiring, enable sales with pre-sales teams, and work closely with development managers/leads for solutioning and delivery statuses, as well as architects for technology research and solutions.


What You Will Do: 

  • Analyze Business Requirements.
  • Analyze the Data Model and do GAP analysis with Business Requirements and Power BI. Design and Model Power BI schema.
  • Transformation of Data in Power BI/SQL/ETL Tool.
  • Create DAX formulas, reports, and dashboards.
  • Experience writing SQL Queries and stored procedures.
  • Design effective Power BI solutions based on business requirements.
  • Manage a team of Power BI developers and guide their work.
  • Integrate data from various sources into Power BI for analysis.
  • Optimize performance of reports and dashboards for smooth usage.
  • Collaborate with stakeholders to align Power BI projects with goals.
  • Knowledge of data warehousing (must); data engineering is a plus


What we need?

  • B. Tech computer science or equivalent
  • Minimum 5+ years of relevant experience 


Why join us?

  • Work with a passionate and innovative team in a fast-paced, growth-oriented environment.
  • Gain hands-on experience in content marketing with exposure to real-world projects.
  • Opportunity to learn from experienced professionals and enhance your marketing skills.
  • Contribute to exciting initiatives and make an impact from day one.
  • Competitive stipend and potential for growth within the company.
  • Recognized for excellence in data and AI solutions with industry awards and accolades.


Read more
QAgile Services

at QAgile Services

1 recruiter
Radhika Chotai
Posted by Radhika Chotai
Pune
4 - 8 yrs
₹7L - ₹18L / yr
Java
JavaScript
HTML/CSS
PostgreSQL
SQL
+8 more

Key Responsibilities would include: 


1. Design, develop, and maintain enterprise-level Java applications. 

2. Collaborate with cross-functional teams to gather and analyze requirements, and implement solutions. 

3. Develop & customize the application using HTML5, CSS, and jQuery to create dynamic and responsive user interfaces. 

4. Integrate with relational databases (RDBMS) to manage and retrieve data efficiently. 

5. Write clean, maintainable, and efficient code following best practices and coding standards. 

6. Participate in code reviews, debugging, and testing to ensure high-quality deliverables. 

7. Troubleshoot and resolve issues in existing applications and systems. 


Qualification requirement - 


1. 4 years of hands-on experience in Java/J2EE development, preferably with enterprise-level projects.

2. Spring Framework, including SOA, AOP, and Spring Security

3. Proficiency in web technologies including HTML5, CSS, jQuery, and JavaScript.

4. Experience with RESTful APIs and web services.

5. Knowledge of build tools like Maven or Gradle

6. Strong knowledge of relational databases (e.g., MySQL, PostgreSQL, Oracle) and experience with SQL.

7. Experience with version control systems like Git.

8. Understanding of software development lifecycle (SDLC) 

9. Strong problem-solving skills and attention to detail.

Read more
SaaS Spend Management Platform

Agency job
via Recruiting Bond by Pavan Kumar
Delhi, Gurugram, Noida, Ghaziabad, Faridabad
1 - 3 yrs
₹4L - ₹7L / yr
Python
React.js
SQL
Fullstack Developer
Large Language Models (LLM)
+14 more

Requirement:

● Role: Fullstack Developer

● Location: Noida (Hybrid)

● Experience: 1-3 years

● Type: Full-Time


Role Description: We’re seeking a Fullstack Developer to join our fast-moving team at Velto. You’ll be responsible for building robust backend services and user-facing features using a modern tech stack. In this role, you’ll also get hands-on exposure to applied AI, contributing to the development of LLM-powered workflows, agentic systems, and custom fine-tuning pipelines.


Responsibilities:

● Develop and maintain backend services using Python and FastAPI (a minimal sketch follows this list)

● Build interactive frontend components using React

● Work with SQL databases, design schema, and integrate data models with Python

● Integrate and build features on top of LLMs and agent frameworks (e.g., LangChain, OpenAI, HuggingFace)

● Contribute to AI fine-tuning pipelines, retrieval-augmented generation (RAG) setups, and contract intelligence workflows

● Proficiency with unit testing libraries like Jest, React Testing Library, and pytest.

● Collaborate in agile sprints to deliver high-quality, testable, and scalable code

● Ensure end-to-end performance, security, and reliability of the stack
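As referenced in the first responsibility above, here is a minimal, hedged FastAPI sketch with one write and one read endpoint; the "contracts" resource, its fields, and the in-memory store are illustrative assumptions, not Velto's actual API or data model.

```python
# Minimal illustrative FastAPI service; the resource name and fields are hypothetical.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class Contract(BaseModel):
    id: int
    title: str
    value: float

# In-memory store standing in for a SQL-backed repository.
contracts: dict[int, Contract] = {}

@app.post("/contracts", status_code=201)
def create_contract(contract: Contract) -> Contract:
    contracts[contract.id] = contract
    return contract

@app.get("/contracts/{contract_id}")
def get_contract(contract_id: int) -> Contract:
    if contract_id not in contracts:
        raise HTTPException(status_code=404, detail="Contract not found")
    return contracts[contract_id]
```

A sketch like this would typically be run locally with `uvicorn main:app --reload` (assuming the file is named main.py) before a real SQL-backed data layer is wired in.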


Required Skills:

● Proficient in Python and experienced with web frameworks like FastAPI

● Strong grasp of JavaScript and React for frontend development

● Solid understanding of SQL and relational database integration with Python

● Exposure to LLMs, vector databases, and AI-based applications (projects, internships, or coursework count)

● Familiar with Git, REST APIs, and modern software development practices

● Bachelor’s degree in Computer Science or an equivalent field


Nice to Have:

● Experience working with LangChain, RAG pipelines, or building agentic workflows

● Familiarity with containerization (Docker), basic DevOps, or cloud deployment

● Prior project or internship involving AI/ML, NLP, or SaaS products

Why Join Us?

● Work on real-world applications of AI in enterprise SaaS

● Fast-paced, early-stage startup culture with direct ownership

● Learn by doing—no layers, no red tape

● Hybrid work setup and merit-based growth



Read more
Domgys India Services Pvt Ltd
Delhi, Gurugram, Noida, Ghaziabad, Faridabad
2 - 5 yrs
₹1L - ₹6L / yr
.NET
JavaScript
PHP
Java
NodeJS (Node.js)
+2 more

About the Role: We are looking for a passionate and motivated Junior Backend Developer to join our team. As a backend developer, you’ll work closely with front-end developers, product managers, and other engineers to build scalable and efficient backend systems. This is a great opportunity for someone looking to kickstart their career in backend development, learn best practices, and grow within a collaborative environment.


Key Responsibilities:

- Assist in developing server-side logic and APIs using a language such as Node.js, Python, Java, PHP, or Ruby.

- Write clean, maintainable, and efficient code.

- Work with databases like MySQL, PostgreSQL, or MongoDB.

- Collaborate with the front-end team to integrate user-facing elements.

- Participate in code reviews and contribute to improving development processes.

- Debug and fix issues reported in backend services.

- Stay up-to-date with emerging technologies and development trends.


Required Skills & Qualifications:

- Basic understanding of backend development using one or more languages (e.g., Node.js, Python, Java, PHP, etc.)

- Familiarity with RESTful APIs and web services.

- Knowledge of databases (SQL or NoSQL).

- Understanding of version control systems like Git.

- Willingness to learn, take feedback, and adapt quickly.

- Good problem-solving and communication skills.


Preferred (but not required):

- Internship or academic project experience in backend development.

- Exposure to cloud platforms (e.g., AWS, Azure, GCP).

- Familiarity with Docker or containerization concepts.

- Understanding of basic authentication & security practices in backend systems.

Read more
Tech Prescient

at Tech Prescient

2 candid answers
3 recruiters
Ashwini Damle
Posted by Ashwini Damle
Remote, Bengaluru (Bangalore)
8 - 10 yrs
₹20L - ₹35L / yr
Java
NodeJS (Node.js)
NOSQL Databases
SQL
Amazon Web Services (AWS)
+1 more

Job Title- Senior Full Stack Web Developer

Job location- Bangalore/Hybrid

Availability- Immediate Joiners

Experience Range- 5-8yrs

Desired skills - Java, AWS, SQL/NoSQL, JavaScript, Node.js (good to have)


We are looking for a Senior Full Stack Web Developer (Java) with 8-10 years of experience.



  1. Working on different aspects of the core product and associated tools, (server-side or user-interfaces depending on the team you'll join)
  2. Expertise as a full stack software engineer on large-scale, complex software systems, with at least 8+ years of experience with technologies such as Java, relational and non-relational databases, Node.js, and AWS Cloud
  3. Assisting with in-life maintenance, testing, debugging and documentation of deployed services
  4. Coding & designing new features
  5. Creating the supporting functional and technical specifications
  6. Deep understanding of system architecture , and distributed systems
  7. Stay updated with the latest services, tools, and trends, and implement innovative solutions that contribute to the company's growth


Read more
Tech Prescient

at Tech Prescient

2 candid answers
3 recruiters
Ashwini Damle
Posted by Ashwini Damle
Remote, Bengaluru (Bangalore)
8 - 10 yrs
₹15L - ₹35L / yr
Java
Amazon Web Services (AWS)
Kubernetes
SQL
Spring Boot

Job Title- Senior Java Developer

Exp Range- 8-10 yrs

Location- Bangalore/ Hybrid

Desired skill- Java 8, Microservices (Must), AWS, Kafka, Kubernetes


What you will bring:


● Strong core Java, concurrency and server-side experience

● 8 + Years of experience with hands-on coding.

● Strong Java8 and Microservices. (Must)

● Should have good understanding on AWS/GCP

● Kafka, AWS stack/Kubernetes

● An understanding of Object Oriented Design and standard design patterns.

● Experience of multi-threaded, 3-tier architectures/Distributed architectures, web services and caching.

● A familiarity with SQL databases

● Ability and willingness to work in a global, fast-paced environment.


● Flexible with the ability to adapt working style to meet objectives.

● Excellent communication and analytical skills

● Ability to effectively communicate with team members

● Experience in the following technologies would be beneficial but not essential, SpringBoot, AWS, Kubernetes, Terraform, Redis

Read more
Client based at Bangalore location.

Agency job
Remote only
8 - 12 yrs
₹24L - ₹30L / yr
Real-World Evidence
RWE Analyst
Healthcare
Large Language Models (LLM)
SQL
+10 more

Real-World Evidence (RWE) Analyst

Summary:

As an experienced Real-World Evidence (RWE) Analyst, you will leverage our cutting-edge healthcare data platform (accessing over 60 million lives in Asia, with ambitious growth plans across Africa and the Middle East) to deliver impactful clinical insights to our pharmaceutical clients. You will be involved in the full project lifecycle, from designing analyses to execution and delivery, within our agile data science team. This is an exciting opportunity to contribute significantly to a growing early-stage company focused on improving precision medicine and optimizing patient care for diverse populations.

Responsibilities:

·      Contribute to the design and execution of retrospective and prospective real-world research, including epidemiological and patient outcomes studies.

·      Actively participate in problem-solving discussions by clearly defining issues and proposing effective solutions.

·      Manage the day-to-day progress of assigned workstreams, ensuring seamless collaboration with the data engineering team on analytical requests.

·      Provide timely and clear updates on project status to management and leadership.

·      Conduct in-depth quantitative and qualitative analyses, driven by project objectives and your intellectual curiosity (an illustrative sketch follows this list).

·      Ensure the quality and accuracy of analytical outputs, and contextualize findings by reviewing relevant published research.

·      Synthesize complex findings into clear and compelling presentations and written reports (e.g., slides, documents).

·      Contribute to the development of standards and best practices for future RWE analyses.
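To make the analysis responsibility above concrete, here is a small, illustrative pandas sketch that computes an outcome rate by treatment cohort; the column names and values are entirely hypothetical and are not drawn from the platform's data.

```python
# Illustrative cohort summary on synthetic data; all column names are hypothetical.
import pandas as pd

patients = pd.DataFrame(
    {
        "patient_id": [1, 2, 3, 4, 5, 6],
        "cohort": ["drug_a", "drug_a", "drug_a", "comparator", "comparator", "comparator"],
        "event_within_12m": [1, 0, 0, 1, 1, 0],
        "follow_up_days": [365, 365, 200, 365, 300, 365],
    }
)

summary = (
    patients.groupby("cohort")
    .agg(n_patients=("patient_id", "count"), event_rate=("event_within_12m", "mean"))
    .reset_index()
)
print(summary)
```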

Requirements:

·      Undergraduate or post-graduate degree (MS or PhD preferred) in a quantitative analytical discipline such as Epidemiology, (Bio)statistics, Data Science, Engineering, Econometrics, or Operations Research.

·      8+ years of relevant work experience demonstrating:

o  Strong analytical and problem-solving capabilities.

o  Experience conducting research relevant to the pharmaceutical/biotech industry.

·      Proficiency in technical skills including SQL and at least one programming language (R, Python, or similar).

·      Solid understanding of the healthcare/medical and pharmaceutical industries.

·      Proven experience in managing workstream or project management activities.

·      Excellent written and verbal communication, and strong interpersonal skills with the ability to build collaborative partnerships.

·      Exceptional attention to detail.

·      Proficiency in Microsoft Office Suite (Excel, PowerPoint, Word).

Other Desirable Skills:

·      Demonstrated dedication to teamwork and the ability to collaborate effectively across different functions.

·      A strong desire to contribute to the growth and development of the RWE analytics function.

·      A proactive and innovative mindset with an entrepreneurial spirit, eager to take on a key role in a dynamic, growing company.

Read more
KeyLogic Infotech
Priyanka Muniwala
Posted by Priyanka Muniwala
Surat
0 - 2 yrs
₹1.8L - ₹4.8L / yr
JavaScript
NodeJS (Node.js)
Express
MongoDB
RESTful APIs
+1 more

We are looking for a passionate and skilled Node.js Developer to join our dynamic team. If you're excited about building scalable and efficient backend applications using modern technologies, we’d love to hear from you!


Responsibilities

Develop and maintain server-side components using Node.js

Design and build scalable RESTful APIs

Ensure high performance and responsiveness of applications

Implement security and data protection measures

Collaborate with team members to define and deliver high-quality software

Write unit and integration tests to ensure software quality


Required Skills

✅ Strong proficiency in JavaScript

✅ Good understanding of Node.js and Express

✅ Experience with NoSQL databases, especially MongoDB

✅ Familiarity with Git and version control workflows

✅ Ability to write reusable, testable, and efficient code

✅ Experience with JWT and modern authorization mechanisms

✅ Knowledge of security and data protection best practices

✅ Strong analytical and problem-solving skills




Read more
Wissen Technology

at Wissen Technology

4 recruiters
Seema Srivastava
Posted by Seema Srivastava
Mumbai
5 - 10 yrs
Best in industry
Python
SQL
Databases
Data engineering
Amazon Web Services (AWS)

Job Description: Data Engineer 

Position Overview:

We are seeking a skilled Python Data Engineer with expertise in designing and implementing data solutions using the AWS cloud platform. The ideal candidate will be responsible for building and maintaining scalable, efficient, and secure data pipelines while leveraging Python and AWS services to enable robust data analytics and decision-making processes.

 

Key Responsibilities

· Design, develop, and optimize data pipelines using Python and AWS services such as Glue, Lambda, S3, EMR, Redshift, Athena, and Kinesis (a minimal sketch follows this list).

· Implement ETL/ELT processes to extract, transform, and load data from various sources into centralized repositories (e.g., data lakes or data warehouses).

· Collaborate with cross-functional teams to understand business requirements and translate them into scalable data solutions.

· Monitor, troubleshoot, and enhance data workflows for performance and cost optimization.

· Ensure data quality and consistency by implementing validation and governance practices.

· Work on data security best practices in compliance with organizational policies and regulations.

· Automate repetitive data engineering tasks using Python scripts and frameworks.

· Leverage CI/CD pipelines for deployment of data workflows on AWS.
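As a minimal sketch of the S3-based transform step referenced in the first responsibility above, the snippet below reads a raw CSV object from S3 with boto3, applies a small pandas transformation, and writes the result back as Parquet; the bucket names, keys, and columns are placeholders, and writing Parquet assumes pyarrow is installed.

```python
# Illustrative S3 -> pandas -> S3 transform; bucket, key, and column names are placeholders.
import io

import boto3
import pandas as pd

s3 = boto3.client("s3")

# Extract: read a raw CSV object from S3.
obj = s3.get_object(Bucket="example-raw-bucket", Key="orders/2024-01-01.csv")
df = pd.read_csv(obj["Body"])

# Transform: basic cleansing and a derived column.
df = df.dropna(subset=["order_id"])
df["order_total"] = df["quantity"] * df["unit_price"]

# Load: write the result back to S3 as Parquet (requires pyarrow).
buffer = io.BytesIO()
df.to_parquet(buffer, index=False)
s3.put_object(Bucket="example-curated-bucket", Key="orders/2024-01-01.parquet", Body=buffer.getvalue())
```

In practice the same logic would usually run inside a Glue job or Lambda function and be scheduled or event-driven rather than executed as a one-off script.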

 

Required Skills and Qualifications

· Professional Experience: 5+ years of experience in data engineering or a related field.

· Programming: Strong proficiency in Python, with experience in libraries like pandas, pySpark, or boto3.

· AWS Expertise: Hands-on experience with core AWS services for data engineering, such as:

· AWS Glue for ETL/ELT.

· S3 for storage.

· Redshift or Athena for data warehousing and querying.

· Lambda for serverless compute.

· Kinesis or SNS/SQS for data streaming.

· IAM Roles for security.

· Databases: Proficiency in SQL and experience with relational (e.g., PostgreSQL, MySQL) and NoSQL (e.g., DynamoDB) databases.

· Data Processing: Knowledge of big data frameworks (e.g., Hadoop, Spark) is a plus.

· DevOps: Familiarity with CI/CD pipelines and tools like Jenkins, Git, and CodePipeline.

· Version Control: Proficient with Git-based workflows.

· Problem Solving: Excellent analytical and debugging skills.

 

Optional Skills

· Knowledge of data modeling and data warehouse design principles.

· Experience with data visualization tools (e.g., Tableau, Power BI).

· Familiarity with containerization (e.g., Docker) and orchestration (e.g., Kubernetes).

· Exposure to other programming languages like Scala or Java.

 

Education

· Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.

 

Why Join Us?

· Opportunity to work on cutting-edge AWS technologies.

· Collaborative and innovative work environment.

 

 

Read more
Deqode

at Deqode

1 recruiter
Roshni Maji
Posted by Roshni Maji
Mumbai
2 - 4 yrs
₹8L - ₹13L / yr
Python
RESTful APIs
SQL
JIRA

Requirements:

  • Must have proficiency in Python
  • At least 3+ years of professional experience in software application development.
  • Good understanding of REST APIs and a solid experience in testing APIs.
  • Should have built APIs at some point and practical knowledge on working with them
  • Must have experience in API testing tools like Postman and in setting up the prerequisites and post-execution validations using these tools
  • Ability to develop applications for test automation
  • Should have worked in a distributed micro-service environment
  • Hands-on experience with Python packages for testing (preferably pytest); a minimal example follows this list.
  • Should be able to create fixtures, mock objects and datasets that can be used by tests across different micro-services
  • Proficiency in Git
  • Strong in writing SQL queries
  • Tools like Jira, Asana or a similar bug-tracking tool; Confluence (wiki); Jenkins (CI)
  • Excellent written and oral communication and organisational skills with the ability to work within a growing company with increasing needs
  • Proven track record of ability to handle time-critical projects
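As a minimal example of the API test setup referenced above, the snippet below uses a pytest fixture and the requests library; the base URL, endpoints, and response fields are placeholders for an internal service, not a real API.

```python
# Illustrative API test; the base URL and payload shape are placeholders, not a real service.
import pytest
import requests

@pytest.fixture(scope="session")
def base_url() -> str:
    return "http://localhost:8000"  # would normally come from configuration or environment variables

def test_create_and_fetch_user(base_url):
    payload = {"name": "Asha", "email": "asha@example.com"}

    create = requests.post(f"{base_url}/users", json=payload, timeout=5)
    assert create.status_code == 201
    user_id = create.json()["id"]

    fetch = requests.get(f"{base_url}/users/{user_id}", timeout=5)
    assert fetch.status_code == 200
    assert fetch.json()["email"] == payload["email"]
```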


Good to have:

  • Good understanding of CI/CD
  • Knowledge of queues, especially Kafka
  • Ability to independently manage test environment deployments and handle issues around it
  • Performed load testing of API endpoints
  • Should have built an API test automation framework from scratch and maintained it
  • Knowledge of cloud platforms like AWS, Azure
  • Knowledge of different browsers and cross-platform operating systems
  • Knowledge of JavaScript
  • Web Programming, Docker & 3-Tier Architecture Knowledge is preferred.
  • Should have knowledge of API creation; coding experience would be an add-on.
  • 5+ years experience in test automation using tools like TestNG, Selenium Webdriver (Grid, parallel, SauceLabs), Mocha_Chai front-end and backend test automation
  • Bachelor's degree in Computer Science / IT / Computer Applications


Read more
Wissen Technology

at Wissen Technology

4 recruiters
Seema Srivastava
Posted by Seema Srivastava
Mumbai
5 - 10 yrs
₹15L - ₹30L / yr
SQL
Java
Spring Boot

We're Hiring: Java Developers | Mumbai (Hybrid) 🚀

Are you a passionate Java Developer with 5 to 10 years of experience? Here's your chance to take your career to the next level! 💼

We're looking for talented professionals to join an exciting opportunity with a top-tier BFSI domain Project—a true leader in the market. 🏦💻

🔹 Location: Mumbai

🔹 Work Mode: Hybrid

🔹 Experience: 4 to 10 years

🔹 Domain: BFSI

This is more than just a job—it's a chance to work on impactful projects and grow with some of the best minds in the industry. 🌟

👉 If you're interested, please share your updated resume along with the following details:

Total Experience

Current CTC

Expected CTC

Tag your network or apply now—this could be your next big move! 🔄🚀

Read more
Zafin
Agency job
via hirezyai by HR Hirezyai
Thiruvananthapuram
10 - 12 yrs
₹18L - ₹20L / yr
SQL
DAX
ADF
ETL

Founded in 2002, Zafin offers a SaaS product and pricing platform that simplifies core modernization for top banks worldwide. Our platform enables business users to work collaboratively to design and manage pricing, products, and packages, while technologists streamline core banking systems. 

With Zafin, banks accelerate time to market for new products and offers while lowering the cost of change and achieving tangible business and risk outcomes. The Zafin platform increases business agility while enabling personalized pricing and dynamic responses to evolving customer and market needs. 

Zafin is headquartered in Vancouver, Canada, with offices and customers around the globe including ING, CIBC, HSBC, Wells Fargo, PNC, and ANZ. Zafin is proud to be recognized as a top employer and certified Great Place to Work® in Canada, India and the UK. 

 

Job Summary: 

We are looking for a highly skilled and detail-oriented Data & Visualisation Specialist to join our team. The ideal candidate will have a strong background in Business Intelligence (BI), data analysis, and visualisation, with advanced technical expertise in Azure Data Factory (ADF), SQL, Azure Analysis Services, and Power BI. In this role, you will be responsible for performing ETL operations, designing interactive dashboards, and delivering actionable insights to support strategic decision-making. 


Key Responsibilities: 

· Azure Data Factory: Design, build, and manage ETL pipelines in Azure Data Factory to facilitate seamless data integration across systems. 

· SQL & Data Management: Develop and optimize SQL queries for extracting, transforming, and loading data while ensuring data quality and accuracy. 

· Data Transformation & Modelling: Build and maintain data models using Azure Analysis Services (AAS), optimizing for performance and usability. 

· Power BI Development: Create, maintain, and enhance complex Power BI reports and dashboards tailored to business requirements. 

· DAX Expertise: Write and optimize advanced DAX queries and calculations to deliver dynamic and insightful reports. 

· Collaboration: Work closely with stakeholders to gather requirements, deliver insights, and help drive data-informed decision-making across the organization. 

· Attention to Detail: Ensure data consistency and accuracy through rigorous validation and testing processes.

· Presentation & Reporting: Effectively communicate insights and updates to stakeholders, delivering clear and concise documentation.

 

Skills and Qualifications: 


Technical Expertise: 

· Proficient in Azure Data Factory for building ETL pipelines and managing data flows. 

· Strong experience with SQL, including query optimization and data transformation. 

· Knowledge of Azure Analysis Services for data modelling 

· Advanced Power BI skills, including DAX, report development, and data modelling. 

· Familiarity with Microsoft Fabric and Azure Analytics (a plus) 

· Analytical Thinking: Ability to work with complex datasets, identify trends, and tackle ambiguous challenges effectively 


Communication Skills: 

· Excellent verbal and written communication skills, with the ability to convey complex technical information to non-technical stakeholders. 

· Educational Qualification: Minimum of a Bachelor's degree, preferably in a quantitative field such as Mathematics, Statistics, Computer Science, Engineering, or a related discipline 


What’s in it for you 

Joining our team means being part of a culture that values diversity, teamwork, and high-quality work. We offer competitive salaries, annual bonus potential, generous paid time off, paid volunteering days, wellness benefits, and robust opportunities for professional growth and career advancement.


Read more
NA

Agency job
via Method Hub by Sampreetha Pai
anywhere in India
4 - 5 yrs
₹18L - ₹22L / yr
SQL Azure
Apache Spark
DevOps
PySpark
Python
+1 more

Azure DE

Primary Responsibilities -

  • Create and maintain data storage solutions including Azure SQL Database, Azure Data Lake, and Azure Blob Storage.
  • Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in Azure
  • Create data models for analytics purposes
  • Utilizing Azure Data Factory or comparable technologies, create and maintain ETL (Extract, Transform, Load) operations
  • Use Azure Data Factory and Databricks to assemble large, complex data sets
  • Implement data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data.
  • Ensure data security and compliance
  • Collaborate with data engineers, and other stakeholders to understand requirements and translate them into scalable and reliable data platform architectures

Required skills:

  • Blend of technical expertise, analytical problem-solving, and collaboration with cross-functional teams
  • Azure DevOps
  • Apache Spark, Python
  • SQL proficiency
  • Azure Databricks knowledge
  • Big data technologies


The DEs should be well versed in coding, Spark core, and data ingestion using Azure. They also need solid communication skills, core Azure DE skills, and coding skills (PySpark, Python, and SQL).
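For illustration, here is a minimal PySpark ingestion sketch of the kind described above: reading a CSV landed in ADLS, applying simple cleansing and a derived column, and writing Parquet. The abfss:// paths and column names are placeholders, and authentication to the storage account is assumed to be configured separately (for example, in Databricks).

```python
# Illustrative PySpark ingestion; paths and columns are placeholders, storage auth is assumed.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-ingest").getOrCreate()

raw = (
    spark.read.option("header", "true")
    .option("inferSchema", "true")
    .csv("abfss://raw@examplestorage.dfs.core.windows.net/orders/")
)

# Basic validation/cleansing and a derived column.
clean = (
    raw.dropna(subset=["order_id"])
    .withColumn("order_total", F.col("quantity") * F.col("unit_price"))
)

(
    clean.write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("abfss://curated@examplestorage.dfs.core.windows.net/orders/")
)
```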

Out of the 7 open positions, 5 of the Azure Data Engineers should have a minimum of 5 years of relevant data engineering experience.


Read more
Wissen Technology

at Wissen Technology

4 recruiters
Vijayalakshmi Selvaraj
Posted by Vijayalakshmi Selvaraj
Pune, Ahmedabad
4 - 9 yrs
₹10L - ₹35L / yr
Python
pytest
Amazon Web Services (AWS)
Test Automation (QA)
SQL

At least 5 years of experience in testing and developing automation tests.

A minimum of 3 years of experience writing tests in Python, with a preference for experience in designing automation frameworks.

Experience in developing automation for big data testing, including data ingestion, data processing, and data migration, is highly desirable.

Familiarity with Playwright or other browser application testing frameworks is a significant advantage.

Proficiency in object-oriented programming and principles is required.

Extensive knowledge of AWS services is essential.

Strong expertise in REST API testing and SQL is required.

A solid understanding of testing and development life cycle methodologies is necessary.

Knowledge of the financial industry and trading systems is a plus

Read more