50+ SQL Jobs in India
Apply to 50+ SQL Jobs on CutShort.io. Find your next job, effortlessly. Browse SQL Jobs and apply today!

Required Skills and Qualifications :
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Data Modeler or in a similar role at an asset manager or financial firm.
- Strong understanding of various business concepts related to buy-side financial firms. Understanding of Private Markets (Private Credit, Private Equity, Real Estate, Alternatives) is required.
- Strong understanding of database design principles and data modeling techniques (e.g., ER modeling, dimensional modeling).
- Knowledge of SQL and experience with relational databases (e.g., Oracle, SQL Server, MySQL).
- Familiarity with NoSQL databases is a plus.
- Excellent analytical and problem-solving skills.
- Strong communication skills and the ability to work collaboratively.
Preferred Qualifications:
- Experience in data warehousing and business intelligence.
- Knowledge of data governance practices.
- Certification in data modeling or related fields.
Key Responsibilities :
- Design and develop conceptual, logical, and physical data models based on business requirements.
- Collaborate with stakeholders in finance, operations, risk, legal, compliance and front offices to gather and analyze data requirements.
- Ensure data models adhere to best practices for data integrity, performance, and security.
- Create and maintain documentation for data models, including data dictionaries and metadata.
- Conduct data profiling and analysis to identify data quality issues.
- Conduct detailed meetings and discussions with business to translate broad business functionality requirements into data concepts, data models and data products.
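To give a flavour of the dimensional-modelling work described above, here is a minimal star-schema sketch using Python's built-in sqlite3 module. The table and column names (a fund dimension and a positions fact) are hypothetical, chosen purely for illustration; a real model would be derived from the business requirements gathered with stakeholders.

```python
import sqlite3

# Illustrative star schema: one dimension table, one fact table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension: funds (hypothetical attributes)
cur.execute("""
    CREATE TABLE dim_fund (
        fund_id INTEGER PRIMARY KEY,
        fund_name TEXT NOT NULL,
        asset_class TEXT NOT NULL  -- e.g. 'Private Credit', 'Real Estate'
    )
""")

# Fact: periodic positions, keyed to the dimension
cur.execute("""
    CREATE TABLE fact_position (
        position_date TEXT NOT NULL,
        fund_id INTEGER NOT NULL REFERENCES dim_fund(fund_id),
        market_value REAL NOT NULL
    )
""")

cur.execute("INSERT INTO dim_fund VALUES (1, 'PC Fund I', 'Private Credit')")
cur.executemany(
    "INSERT INTO fact_position VALUES (?, ?, ?)",
    [("2024-01-31", 1, 120.0), ("2024-02-29", 1, 125.5)],
)

# A typical analytical query joining fact to dimension
total = cur.execute("""
    SELECT d.asset_class, SUM(f.market_value)
    FROM fact_position f JOIN dim_fund d USING (fund_id)
    GROUP BY d.asset_class
""").fetchall()
print(total)  # [('Private Credit', 245.5)]
```

The same fact/dimension split carries over directly to Oracle or SQL Server; only the DDL dialect changes.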
We are looking for an experienced Java Support Engineer with 4+ years of hands-on experience in supporting and maintaining Java/Spring Boot-based applications. The ideal candidate will be responsible for production support, debugging issues, and ensuring smooth application performance.
Key Responsibilities:
- Provide L2/L3 support for Java/Spring Boot applications in production and non-production environments.
- Perform incident analysis, root cause identification, and apply quick fixes or permanent solutions.
- Handle application deployments, environment monitoring, and performance tuning.
- Collaborate with development, DevOps, and database teams to resolve technical issues.
- Write and debug SQL queries, manage data fixes, and ensure database integrity.
- Use monitoring tools like Splunk, Kibana, or ELK Stack for issue investigation.
- Prepare documentation for recurring issues and maintain knowledge base.
Technical Skills:
- Strong in Core Java, Spring Boot, RESTful APIs
- Good knowledge of SQL / PL-SQL (Oracle / MySQL / PostgreSQL)
- Familiar with Linux/Unix commands and Shell scripting
- Exposure to microservices architecture and CI/CD tools (Jenkins, Maven)
- Hands-on experience with application monitoring and log analysis tools
- Knowledge of cloud (AWS / Azure) environments is a plus
Soft Skills:
- Strong problem-solving and analytical mindset
- Good communication and teamwork skills
- Ability to work under pressure and handle on-call support if required
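As an illustration of the log-analysis side of this support role, the sketch below counts recurring errors per service in a handful of made-up log lines. In practice this kind of query would run inside Splunk, Kibana, or the ELK Stack; the service names and messages here are invented.

```python
import re
from collections import Counter

# Hypothetical application log lines (real ones would come from
# Splunk/Kibana exports or server log files).
log_lines = [
    "2024-05-01 10:00:01 INFO  OrderService started",
    "2024-05-01 10:00:05 ERROR OrderService NullPointerException at line 42",
    "2024-05-01 10:00:09 WARN  PaymentService slow response 2300ms",
    "2024-05-01 10:00:12 ERROR OrderService NullPointerException at line 42",
]

# Count ERROR entries per service to spot recurring incidents
pattern = re.compile(r"\bERROR\s+(\w+)")
error_counts = Counter(
    m.group(1)
    for m in (pattern.search(line) for line in log_lines)
    if m
)
print(error_counts.most_common())  # [('OrderService', 2)]
```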


Required Skills:
- 4+ years of experience designing, developing, and implementing enterprise-level, n-tier, software solutions.
- Proficiency with Microsoft C# is a must.
- In-depth experience with .NET framework and .NET Core.
- Knowledge of OOP, server technologies, and SOA is a must; 3+ years of microservices experience.
- Relevant experience with database design and SQL (Postgres is preferred).
- Experience with ORM tooling.
- Experience delivering software that is correct, stable, and security compliant.
- Basic understanding of common cloud platforms (good to have).
- Financial services experience is strongly preferred.
- Thorough understanding of XML/JSON and related technologies.
- Thorough understanding of unit, integration, and performance testing for APIs.
- Entrepreneurial spirit. You are self-directed, innovative, and biased towards action. You love to build new things and thrive in fast-paced environments.
- Excellent communication and interpersonal skills, with an emphasis on strong writing and analytical problem-solving.

Wissen Technology is hiring for Data Engineer
About Wissen Technology:
At Wissen Technology, we deliver niche, custom-built products that solve complex business challenges across industries worldwide. Founded in 2015, our core philosophy is built around a strong product engineering mindset—ensuring every solution is architected and delivered right the first time. Today, Wissen Technology has a global footprint with 2000+ employees across offices in the US, UK, UAE, India, and Australia.
Our commitment to excellence translates into delivering 2X impact compared to traditional service providers. How do we achieve this? Through a combination of deep domain knowledge, cutting-edge technology expertise, and a relentless focus on quality. We don't just meet expectations—we exceed them by ensuring faster time-to-market, reduced rework, and greater alignment with client objectives.
We have a proven track record of building mission-critical systems across industries, including financial services, healthcare, retail, manufacturing, and more. Wissen stands apart through its unique delivery models. Our outcome-based projects ensure predictable costs and timelines, while our agile pods provide clients the flexibility to adapt to their evolving business needs. Wissen leverages its thought leadership and technology prowess to drive superior business outcomes. Our success is powered by top-tier talent. Our mission is clear: to be the partner of choice for building world-class custom products that deliver exceptional impact—the first time, every time.
Job Summary:
Wissen Technology is hiring a Data Engineer with a strong background in Python, data engineering, and workflow optimization. The ideal candidate will have experience with Delta Tables, Parquet, and be proficient in Pandas and PySpark.
Experience: 7+ years
Location: Pune, Mumbai, Bangalore
Mode of Work: Hybrid
Key Responsibilities:
- Develop and maintain data pipelines using Python (Pandas, PySpark).
- Optimize data workflows and ensure efficient data processing.
- Work with Delta Tables and Parquet for data storage and management.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions.
- Ensure data quality and integrity throughout the data lifecycle.
- Implement best practices for data engineering and workflow optimization.
Qualifications and Required Skills:
- Proficiency in Python, specifically with Pandas and PySpark.
- Strong experience in data engineering and workflow optimization.
- Knowledge of Delta Tables and Parquet.
- Excellent problem-solving skills and attention to detail.
- Ability to work collaboratively in a team environment.
- Strong communication skills.
Good to Have Skills:
- Experience with Databricks.
- Knowledge of Apache Spark, DBT, and Airflow.
- Advanced Pandas optimizations.
- Familiarity with PyTest/DBT testing frameworks.
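A minimal Pandas sketch of the kind of cleaning-and-aggregation step such pipelines perform (the column names and values are invented for illustration; production pipelines would typically read from Delta Tables or Parquet sources rather than inline data):

```python
import pandas as pd

# Toy input batch with one incomplete row
raw = pd.DataFrame({
    "symbol": ["AAPL", "AAPL", "MSFT", "MSFT"],
    "qty": [10, None, 5, 15],
    "price": [190.0, 191.0, 410.0, 412.0],
})

cleaned = raw.dropna(subset=["qty"])  # drop rows missing a quantity
cleaned = cleaned.assign(notional=cleaned["qty"] * cleaned["price"])
summary = cleaned.groupby("symbol", as_index=False)["notional"].sum()
print(summary)  # AAPL: 1900.0, MSFT: 8230.0
```

The same dropna/assign/groupby shape maps almost one-to-one onto PySpark's `dropna`, `withColumn`, and `groupBy().agg()` when the data outgrows a single machine.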
Wissen Sites:
- Website: http://www.wissen.com
- LinkedIn: https://www.linkedin.com/company/wissen-technology
- Wissen Leadership: https://www.wissen.com/company/leadership-team/
- Wissen Live: https://www.linkedin.com/company/wissen-technology/posts/?feedView=All
- Wissen Thought Leadership: https://www.wissen.com/articles/
Wissen | Driving Digital Transformation
A technology consultancy that drives digital innovation by connecting strategy and execution, helping global clients to strengthen their core technology.

Key Responsibilities:
Design, build and maintain scalable ETL/ELT pipelines using Azure Data Factory, Azure Databricks and Spark.
Develop and optimize data workflows using SQL and Python/Scala for large-scale processing.
Implement performance tuning and optimization strategies for pipelines and Spark jobs.
Support feature engineering and model deployment workflows with data engineering teams.
Ensure data quality, validation, error-handling and monitoring are in place.
Work with Delta Lake, Parquet and Big Data storage (ADLS / Blob).
Required Skills:
Azure Data Platform: Data Factory, Databricks, ADLS / Blob Storage.
Strong SQL and Python or Scala.
Big Data technologies: Spark, Delta Lake, Parquet.
ETL/ELT pipeline design and data transformation expertise.
Data pipeline optimization, performance tuning and CI/CD for data workloads.
Nice-to-Have:
Familiarity with data governance, security and compliance in hybrid environments.
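The "data quality, validation, error-handling" responsibility above often boils down to a pre-load gate over each batch. Here is a deliberately simple sketch in plain Python (the row shape and rules are hypothetical; real checks depend on the target schema):

```python
# Toy batch of records; two rows violate the rules below.
rows = [
    {"id": 1, "amount": 250.0, "currency": "USD"},
    {"id": 2, "amount": -10.0, "currency": "USD"},  # negative amount
    {"id": 3, "amount": 99.0, "currency": ""},      # missing currency
]

def validate(row):
    """Return a list of rule violations for one record."""
    errors = []
    if row["amount"] < 0:
        errors.append("amount must be non-negative")
    if not row["currency"]:
        errors.append("currency is required")
    return errors

# Collect failing rows; a pipeline would quarantine these before loading
bad = {}
for r in rows:
    errors = validate(r)
    if errors:
        bad[r["id"]] = errors

print(bad)  # {2: ['amount must be non-negative'], 3: ['currency is required']}
```

In Databricks the same idea is usually expressed as Spark column expressions or Delta Lake constraints, but the gate-then-quarantine pattern is identical.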
Now Hiring: Tableau Developer (Banking Domain) 🚀
We’re looking for a 6+ years experienced Tableau pro to design and optimize dashboards for Banking & Financial Services.
🔹 Design & optimize interactive Tableau dashboards for large banking datasets
🔹 Translate KPIs into scalable reporting solutions
🔹 Ensure compliance with regulations like KYC, AML, Basel III, PCI-DSS
🔹 Collaborate with business analysts, data engineers, and banking experts
🔹 Bring deep knowledge of SQL, data modeling, and performance optimization
🌍 Location: Remote
📊 Domain Expertise: Banking / Financial Services
✨ Preferred experience with cloud data platforms (AWS, Azure, GCP) & certifications in Tableau are a big plus!
Bring your data visualization skills to transform banking intelligence & compliance reporting.
⚠️ Important Note (Please Read Before Applying):
- Only candidates with 5–8 years of relevant experience should apply.
- Freshers or candidates with less than 5 years of experience – please do not apply.
- Only immediate joiners or candidates currently serving notice will be considered.
- Strictly no 30/60/90-day notice periods.
Job Title: Java Developer (5–8 Years)
Location: Bangalore (Hybrid Mode)
Experience: 5 to 8 years
Joining: Immediate / Notice Serving Only
About the Role:
We are looking for passionate and highly skilled Java Developers to join our dynamic team in Bangalore. The ideal candidate will have strong expertise in Java, Spring Boot, Collections, Multithreading, and Data Structures & Algorithms (DSA), with proven problem-solving abilities.
Key Responsibilities:
- Design, develop, and maintain high-performance, scalable, and secure applications.
- Work with Spring Boot and related frameworks to build microservices-based solutions.
- Optimize code using Collections & Multithreading concepts for performance and reliability.
- Apply strong DSA and problem-solving skills to deliver efficient solutions.
- Collaborate with cross-functional teams to ensure timely delivery of high-quality software.
- Troubleshoot, debug, and resolve production issues efficiently.
Required Skills & Experience:
- 5–8 years of hands-on experience in Core Java and Spring Boot.
- Strong expertise in Collections, Multithreading, and Concurrency.
- Solid understanding of DSA, Algorithms, and System Design fundamentals.
- Experience in developing REST APIs and Microservices.
- Proficiency in writing clean, maintainable, and efficient code.
- Strong analytical and problem-solving skills.
Why Join Us?
- Opportunity to work on cutting-edge projects with modern architectures.
- Hybrid work setup in Bangalore.
- Fast-paced and growth-driven environment.


Company Description:
NonStop io Technologies, founded in August 2015, is a Bespoke Engineering Studio specializing in Product Development. With over 80 satisfied clients worldwide, we serve startups and enterprises across prominent technology hubs, including San Francisco, New York, Houston, Seattle, London, Pune, and Tokyo. Our experienced team brings over 10 years of expertise in building web and mobile products across multiple industries. Our work is grounded in empathy, creativity, collaboration, and clean code, striving to build products that matter and foster an environment of accountability and collaboration.
Brief Description:
NonStop io is seeking a proficient .NET Developer to join our growing team. You will be responsible for developing, enhancing, and maintaining scalable applications using .NET technologies. This role involves working on a healthcare-focused product and requires strong problem-solving skills, attention to detail, and a passion for software development.
Responsibilities:
- Design, develop, and maintain applications using .NET Core/.NET Framework, C#, and related technologies
- Write clean, scalable, and efficient code while following best practices
- Develop and optimize APIs and microservices
- Work with SQL Server and other databases to ensure high performance and reliability
- Collaborate with cross-functional teams, including UI/UX designers, QA, and DevOps
- Participate in code reviews and provide constructive feedback
- Troubleshoot, debug, and enhance existing applications
- Ensure compliance with security and performance standards, especially for healthcare-related applications
Qualifications & Skills:
- Strong experience in .NET Core/.NET Framework and C#
- Proficiency in building RESTful APIs and microservices architecture
- Experience with Entity Framework, LINQ, and SQL Server
- Familiarity with front-end technologies like React, Angular, or Blazor is a plus
- Knowledge of cloud services (Azure/AWS) is a plus
- Experience with version control (Git) and CI/CD pipelines
- Strong understanding of object-oriented programming (OOP) and design patterns
- Prior experience in healthcare tech or working with HIPAA-compliant systems is a plus
Why Join Us?
- Opportunity to work on a cutting-edge healthcare product
- A collaborative and learning-driven environment
- Exposure to AI and software engineering innovations
- Excellent work ethics and culture
If you're passionate about technology and want to work on impactful projects, we'd love to hear from you!

Data Engineer
Experience: 4–6 years
Key Responsibilities
- Design, build, and maintain scalable data pipelines and workflows.
- Manage and optimize cloud-native data platforms on Azure with Databricks and Apache Spark (1–2 years).
- Implement CI/CD workflows and monitor data pipelines for performance, reliability, and accuracy.
- Work with relational databases (Sybase, DB2, Snowflake, PostgreSQL, SQL Server) and ensure efficient SQL query performance.
- Apply data warehousing concepts including dimensional modelling, star schema, data vault modelling, Kimball and Inmon methodologies, and data lake design.
- Develop and maintain ETL/ELT pipelines using open-source frameworks such as Apache Spark and Apache Airflow.
- Integrate and process data streams from message queues and streaming platforms (Kafka, RabbitMQ).
- Collaborate with cross-functional teams in a geographically distributed setup.
- Leverage Jupyter notebooks for data exploration, analysis, and visualization.
Required Skills
- 4+ years of experience in data engineering or a related field.
- Strong programming skills in Python with experience in Pandas, NumPy, Flask.
- Hands-on experience with pipeline monitoring and CI/CD workflows.
- Proficiency in SQL and relational databases.
- Familiarity with Git for version control.
- Strong communication and collaboration skills with ability to work independently.


About the company
We are the most trusted provider of data collection and management, marketing program management, and analytical solutions for our Crop and Animal Health industry clients. With data services at the core—surrounded by an extensible array of streamlined software solutions—our unified platform represents over three decades of innovation and expertise in the agriculture, crop protection, specialty chemical and animal health industries.
Backed by an entrepreneurial, creative and energetic work force, teammates at AGDATA are pushing the boundaries of technology to enhance our relationships with our clients. We are a growing team, focused on adding creative, knowledgeable individuals who are ready to jump right in and make an immediate impact.
- 30+ years of experience in the Crop and Animal Health industry
- More than 20 billion USD sales processed annually
- Over 2,15,000 payments issued via marketing programs yearly
What’s the role?
If you are looking for an opportunity to solve deep technical problems, build innovative solutions, and work with top-notch software developers in the Pune area, AGDATA might have the role for you.
You must be able to look at the big picture from both business and technology perspective, possess strong analytical, design, and problem-solving skills, and enjoy working with data and algorithms.
You are not afraid of ambiguity, dealing with nebulous requirements, and get excited about difficult challenges.
Our ideal candidate will have...
- 7+ years of software development experience with emphasis on web technologies, cloud computing (Azure preferred), and SaaS
- Deep hands-on experience in Microsoft technologies stack such as .Net 6+, C# (strong knowledge of collections, async await patterns), Web API, windows services, and relational database (MSSQL)
- Proven experience on front end technologies like Angular
- Expertise in RESTful API, SOA, Microservice, AMQP and distributed architecture and design
- Ability to understand complex data relationships
- Experience in Unit Testing
- Experience in Azure cloud services/ Azure DevOps
- Demonstrated skill in aligning application decisions to an overarching solution and systems architecture
- Structured thinker, effective communicator, with excellent programming and analytic skills
In this role, you will ...
- Take your problem-solving skills and expertise in system design to the next level by delivering innovative solutions
- Actively contribute to the development process by writing high-quality code
- Utilize your full stack development skills and work with diverse technologies to deliver outstanding results
- Adapt quickly to new technologies and leverage your past experiences to stay ahead
- Exhibit a passion for building software and delivering high-quality products, prioritizing user experience
- Engage in all phases of the software development life cycle, including design, implementation, and unit testing
- Think from the perspective of our customers, optimizing their experience with our software
How AGDATA will support you:
Supporting your health & well-being:
- Comprehensive medical coverage – up to INR 7.5 lakh for employee and dependents, including parents
- OPD benefit – coverage of up to INR 15 thousand covering expenses across specialties
- Paternity leave up to 14 working days with the option to split leave
Emphasizing work life balance: Flexible hybrid work policy
Experiencing a work culture that promotes from within: In 2023, 14% of our associates were promoted internally
Being comfortable in the office: Coming into our brand-new office space? Free snacks and top class facilities will be available
AccioJob is conducting a Walk-In Hiring Drive with HummingBird Technologies for the position of Java Backend Developer.
To apply, register and select your slot here: https://go.acciojob.com/wNrG3R
Required Skills: DSA, OOPs, SQL, Rest API
Eligibility:
- Degree: BTech./BE, MTech./ME, BCA, MCA, BSc., MSc
- Branch: Computer Science/CSE/Other CS related branch, Electrical/Other electrical related branches, IT
- Graduation Year: 2026
Work Details:
- Work Location: Pune (Onsite)
- CTC: 9 LPA
Evaluation Process:
Round 1: Offline Assessment at AccioJob Pune Centre
Further Rounds (for shortlisted candidates only):
- Profile Evaluation, Technical Interview 1
- Technical Interview 2
- HR Discussion
Important Note: Bring your laptop & earphones for the test.
Register here: https://go.acciojob.com/wNrG3R
AccioJob is conducting a Walk-In Hiring Drive with HummingBird Technologies for the position of Java Backend Developer.
To apply, register and select your slot here: https://go.acciojob.com/gqHtdK
Required Skills: DSA, OOPs, SQL, Rest API
Eligibility:
- Degree: BTech./BE, MTech./ME, BCA, MCA, BSc., MSc
- Branch: Computer Science/CSE/Other CS related branch, Electrical/Other electrical related branches, IT
- Graduation Year: 2024, 2025
Work Details:
- Work Location: Pune (Onsite)
- CTC: 9 LPA
Evaluation Process:
Round 1: Offline Assessment at AccioJob Pune Centre
Further Rounds (for shortlisted candidates only):
- Profile Evaluation, Technical Interview 1
- Technical Interview 2
- HR Discussion
Important Note: Bring your laptop & earphones for the test.
Register here: https://go.acciojob.com/gqHtdK
- 8+ years of Data Engineering experience
- Strong SQL and Redshift experience
- CI/CD and orchestration experience using Bitbucket, Jenkins and Control-M
- Reporting experience preferably Tableau
- Location – Pune, Hyderabad, Bengaluru
Job Description – Java Developer
Role: Java Developer
Location: Pune / Mumbai
Experience: 5 to 10 years
Required Skills:
We are looking for an experienced Java Developer with strong expertise in Core Java, Spring, Spring Boot, and Hibernate. The candidate should have solid experience in designing, developing, and deploying enterprise-grade applications, with strong understanding of OOPs concepts, data structures, and algorithms. Hands-on experience with RESTful APIs, Microservices, and Database technologies (MySQL/Oracle/SQL Server) is essential.
The ideal candidate should be well-versed in version control systems (Git), build tools (Maven/Gradle), and CI/CD pipelines (Jenkins). Exposure to cloud platforms (AWS/Azure/GCP) and containerization (Docker/Kubernetes) will be a strong plus.
Key Responsibilities:
- Design, develop, and maintain scalable and high-performance applications.
- Write clean, reusable, and efficient code following best practices.
- Collaborate with cross-functional teams to deliver quality solutions.
- Perform code reviews, debugging, and performance tuning.
- Ensure application security, reliability, and scalability.
Good To Have Skills:
- Knowledge of front-end technologies (JavaScript, Angular/React).
- Familiarity with Agile/Scrum methodologies.
- Strong problem-solving and analytical skills.


Job Title: Data Engineering Support Engineer / Manager
Experience range: 8+ years
Location: Mumbai
Knowledge, Skills and Abilities
- Python, SQL
- Familiarity with data engineering
- Experience with AWS data and analytics services or similar cloud vendor services
- Strong problem-solving and communication skills
- Ability to organise and prioritise work effectively
Key Responsibilities
- Incident and user management for data and analytics platform
- Development and maintenance of a Data Quality framework (including anomaly detection)
- Implementation of Python & SQL hotfixes and working with data engineers on more complex issues
- Diagnostic tools implementation and automation of operational processes
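The anomaly-detection piece of a data-quality framework can be as simple as flagging metrics that drift far from their recent mean. A toy sketch using Python's statistics module (the row counts and the two-standard-deviation threshold are illustrative choices, not a prescription):

```python
import statistics

# Hypothetical daily row counts for a feed; the last day looks suspect.
daily_row_counts = [1000, 1012, 987, 1005, 998, 400]

mean = statistics.mean(daily_row_counts)
stdev = statistics.stdev(daily_row_counts)

# Flag values more than two sample standard deviations from the mean
anomalies = [x for x in daily_row_counts if abs(x - mean) > 2 * stdev]
print(anomalies)  # [400]
```

A production framework would track thresholds per dataset and raise incidents automatically, but the core check is this small.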
Key Relationships
- Work closely with data scientists, data engineers, and platform engineers in a highly commercial environment
- Support research analysts and traders with issue resolution

Location: Krishnagiri, Tamil Nadu
Experience: Minimum 2 years
Job Type: Full-time
Preferred Candidate: Female
About the Role:
We are seeking a dedicated and enthusiastic Computer Teacher to join our academic team. The ideal candidate will have at least 2 years of teaching experience, strong communication skills, and a passion for imparting computer knowledge to students at the school/college level.
Key Responsibilities:
- Deliver computer science curriculum to students as per academic guidelines.
- Teach foundational topics such as MS Office, Internet, HTML, and basic programming (e.g., Scratch, Python, C – as applicable).
- Plan and execute interactive lessons using digital teaching tools.
- Conduct practical sessions in the computer lab.
- Assess students’ progress through assignments, tests, and projects.
- Maintain attendance, grades, and student performance records.
- Encourage students to participate in tech-based activities and competitions.
- Collaborate with school/college staff for curriculum planning and development.
- Provide basic technical support for classroom technology when needed.
Required Qualifications:
- Bachelor’s degree in Computer Science, BCA, or any relevant discipline.
- B.Ed (preferred for school teaching roles).
- Minimum 2 years of teaching experience in a school or academic institution.
- Good command of English and Tamil (or local language as required).
- Strong classroom management and communication skills.
Preferred Qualities:
- Female candidates are preferred for this role.
- Ability to adapt teaching methods based on student needs.
- Familiarity with smart classroom tools and e-learning platforms.
- Passion for education and mentoring young minds.
Working Hours:
- Monday, Tuesday, Thursday to Saturday
- Timings: 9:00 AM to 4:00 PM
Salary Range:
₹15,000 – ₹20,000/month (based on experience and qualification)



Pay: ₹70,000.00 - ₹90,000.00 per month
Job description:
Name of the College: KGiSL Institute of Technology
College Profile: The main objective of KGiSL Institute of Technology is to provide industry-embedded education and to mold students for leadership in industry, government, and educational institutions; to advance the knowledge base of the engineering professions; and to influence the future directions of engineering education and practice. The ability to connect to future challenges and deliver industry-ready human resources is a credibility that KGiSL Educational Institutions have progressively excelled at. Industry-readiness of its students is what will eventually elevate an institution to star status and its competitiveness in the job market. Choice of such an institution will depend on its proximity to industry, the relevance of its learning programme to real-time industry, and the active connection that a student will have with industry professionals.
Job Title: Assistant Professor / Associate Professor
Departments:
● CSE
Qualification:
● ME / M.Tech / Ph.D (Ph.D mandatory for Associate Professor)
Experience:
● Freshers can apply
● Experience: 8–10 years
Key Responsibilities:
1. Teaching & Learning:
Deliver high-quality lectures and laboratory sessions in core and advanced areas of Computer Science & Engineering.
Prepare lesson plans, teaching materials, and assessment tools as per the approved curriculum.
Adopt innovative teaching methodologies, including ICT-enabled learning and outcome-based education (OBE).
2. Research & Publications:
Conduct independent and collaborative research in areas of specialization.
Publish research papers in peer-reviewed journals and present in reputed conferences.
Eligibility & Qualifications (As per AICTE/UGC Norms):
Educational Qualification: Ph.D. in Computer Science & Engineering or relevant discipline.
Experience: Minimum of 8 years teaching/research/industry experience, with at least 3 years at the level of Assistant Professor.
Research: Minimum of 7 publications in refereed journals as per UGC-CARE list and at least one Ph.D. degree awarded or ongoing under supervision.
Other Requirements:
Good academic record throughout.
Proven ability to attract research funding.
Strong communication and interpersonal skills.
Work Location: KGiSL Campus
Employment Type: Full-time / Permanent
Joining time: immediately
Job Type: Full-time
Benefits:
- Health insurance
- Life insurance
- Provident Fund
Work Location: In person

Responsibilities
- Design, develop, and maintain backend systems and RESTful APIs using Python (Django, FastAPI, or Flask)
- Build real-time communication features using WebSockets, SSE, and async IO
- Implement event-driven architectures using messaging systems like Kafka, RabbitMQ, Redis Streams, or NATS
- Develop and maintain microservices that interact over messaging and streaming protocols
- Ensure high scalability and availability of backend services
- Collaborate with frontend developers, DevOps engineers, and product managers to deliver end-to-end solutions
- Write clean, maintainable code with unit/integration tests
- Lead technical discussions, review code, and mentor junior engineers
Requirements
- 8+ years of backend development experience, with at least 8 years in Python
- Strong experience with asynchronous programming in Python (e.g., asyncio, aiohttp, FastAPI)
- Production experience with WebSockets and Server-Sent Events
- Hands-on experience with at least one messaging system: Kafka, RabbitMQ, Redis Pub/Sub, or similar
- Proficient in RESTful API design and microservices architecture
- Solid experience with relational and NoSQL databases
- Familiarity with Docker and container-based deployment
- Strong understanding of API security, authentication, and performance optimization
Nice to Have
- Experience with GraphQL or gRPC
- Familiarity with stream processing frameworks (e.g., Apache Flink, Spark Streaming)
- Cloud experience (AWS, GCP, Azure), particularly with managed messaging or pub/sub services
- Knowledge of CI/CD and infrastructure as code
- Exposure to AI engineering workflows and tools
- Interest or experience in building Agentic AI systems or integrating backends with AI agents
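The event-driven, asynchronous pattern this role centres on can be sketched in-process with asyncio queues. This is only a stand-in for what brokers like Kafka or RabbitMQ provide (the Broker class and event payload below are invented for illustration):

```python
import asyncio

class Broker:
    """Minimal in-process publish/subscribe using asyncio queues."""

    def __init__(self):
        self.subscribers = []

    def subscribe(self):
        # Each subscriber gets its own queue (like a consumer's channel)
        q = asyncio.Queue()
        self.subscribers.append(q)
        return q

    async def publish(self, message):
        # Fan the message out to every subscriber
        for q in self.subscribers:
            await q.put(message)

async def main():
    broker = Broker()
    inbox = broker.subscribe()
    await broker.publish({"event": "order_created", "id": 42})
    return await inbox.get()

received = asyncio.run(main())
print(received)  # {'event': 'order_created', 'id': 42}
```

With a real broker the queue becomes a network topic and subscribers become independent services, but the fan-out-and-await shape is the same.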
Job Description
3-5 years of hands-on experience in manual testing involving functional, non-functional, regression, and integration testing in a structured environment.
Candidate should have exceptional communication skills.
Should have minimum 1 year work experience in data comparison testing.
Experience in testing web-based applications.
Able to define the scope of testing.
Experience in testing large-scale solutions integrating multiple source and target systems.
Experience in API testing.
Experience in Database verification using SQL queries.
Experience working in an Agile team.
Should be able to attend Agile ceremonies in UK hours.
Having a good understanding of Data Migration projects will be a plus.

What We’re Looking For:
- Strong experience in Python (5+ years).
- Hands-on experience with any database (SQL or NoSQL).
- Experience with frameworks like Flask, FastAPI, or Django.
- Knowledge of ORMs, API development, and unit testing.
- Familiarity with Git and Agile methodologies.
- Familiarity with the Kafka tool (Added Advantage)

What We’re Looking For:
- Strong experience in Python (4+ years).
- Hands-on experience with any database (SQL or NoSQL).
- Experience with frameworks like Flask, FastAPI, or Django.
- Knowledge of ORMs, API development, and unit testing.
- Familiarity with Git and Agile methodologies.
- Familiarity with the Kafka tool (Added Advantage)
Job Summary:
We are looking for a skilled and motivated Backend Engineer with 2 to 4 years of professional experience to join our dynamic engineering team. You will play a key role in designing, building, and maintaining the backend systems that power our products. You’ll work closely with cross-functional teams to deliver scalable, secure, and high-performance solutions that align with business and user needs.
This role is ideal for engineers ready to take ownership of systems, contribute to architectural decisions, and solve complex backend challenges.
Website: https://www.thealteroffice.com/about
Key Responsibilities:
- Design, build, and maintain robust backend systems and APIs that are scalable and maintainable.
- Collaborate with product, frontend, and DevOps teams to deliver seamless, end-to-end solutions.
- Model and manage data using SQL (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Redis), incorporating caching where needed.
- Implement and manage authentication, authorization, and data security practices.
- Write clean, well-documented, and well-tested code following best practices.
- Work with cloud platforms (AWS, GCP, or Azure) to deploy, monitor, and scale services effectively.
- Use tools like Docker (and optionally Kubernetes) for containerization and orchestration of backend services.
- Maintain and improve CI/CD pipelines for faster and safer deployments.
- Monitor and debug production issues, using observability tools (e.g., Prometheus, Grafana, ELK) for root cause analysis.
- Participate in code reviews, contribute to improving development standards, and provide support to less experienced engineers.
- Work with event-driven or microservices-based architecture, and optionally use technologies like GraphQL, WebSockets, or message brokers such as Kafka or RabbitMQ when suitable for the solution.
Requirements:
- 2 to 4 years of professional experience as a Backend Engineer or similar role.
- Proficiency in at least one backend programming language (e.g., Python, Java, Go, Ruby).
- Strong understanding of RESTful API design, asynchronous programming, and scalable architecture patterns.
- Solid experience with both relational and NoSQL databases, including designing and optimizing data models.
- Familiarity with Docker, Git, and modern CI/CD workflows.
- Hands-on experience with cloud infrastructure and deployment processes (AWS, GCP, or Azure).
- Exposure to monitoring, logging, and performance profiling practices in production environments.
- A good understanding of security best practices in backend systems.
- Strong problem-solving, debugging, and communication skills.
- Comfortable working in a fast-paced, agile environment with evolving priorities.
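The RESTful API design mentioned in the requirements above can be sketched with nothing but the Python standard library; this is a minimal, hypothetical example (the `/items` resource and its data are invented for illustration), not a production pattern:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical in-memory resource store for the sketch.
ITEMS = {"1": {"id": "1", "name": "widget"}}

class ItemHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Route GET /items/<id> to a JSON response, 404 if unknown.
        _, _, item_id = self.path.rpartition("/")
        item = ITEMS.get(item_id)
        status = 200 if item else 404
        body = json.dumps(item or {"error": "not found"}).encode()
        self.send_response(status)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        # Silence per-request logging for the example.
        pass

# Bind to an ephemeral port and serve from a background thread.
server = HTTPServer(("127.0.0.1", 0), ItemHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/items/1"
with urllib.request.urlopen(url) as resp:
    payload = json.loads(resp.read())
print(payload)  # {'id': '1', 'name': 'widget'}

server.shutdown()
server.server_close()
```

In practice a framework (Flask, FastAPI, Express, etc.) would handle routing, serialization, and error handling, but the request/resource/status-code mechanics are the same.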


Role Overview:
We are seeking a highly skilled and experienced Lead Web App Developer - Backend to join our dynamic team in Bengaluru. The ideal candidate will have a strong background in backend development, microservices architecture, and cloud technologies, with a proven ability to deliver robust, scalable solutions. This role involves designing, implementing, and maintaining complex distributed systems, primarily in the Mortgage Finance domain.
Key Responsibilities:
- Cloud-Based Web Applications Development:
- Lead backend development efforts for cloud-based web applications.
- Work on diverse projects within the Mortgage Finance domain.
- Microservices Design & Development:
- Design and implement microservices-based architectures.
- Ensure scalability, availability, and reliability of distributed systems.
- Programming & API Development:
- Write efficient, reusable, and maintainable code in Python, Node.js, and Java.
- Develop and optimize RESTful APIs.
- Infrastructure Management:
- Leverage AWS platform infrastructure to build secure and scalable solutions.
- Utilize tools like Docker for containerization and deployment.
- Database Management:
- Work with RDBMS (MySQL) and NoSQL databases to design efficient schemas and optimize queries.
- Team Collaboration:
- Collaborate with cross-functional teams to ensure seamless integration and delivery of projects.
- Mentor junior developers and contribute to the overall skill development of the team.
Core Requirements:
- Experience: 10+ years in backend development, with at least 3 years designing and delivering large-scale products on microservices architecture.
- Technical Skills:
- Programming Languages: Python, Node.js, Java.
- Frameworks & Tools: AWS (Lambda, RDS, etc.), Docker.
- Database Expertise: Proficiency in RDBMS (MySQL) and NoSQL databases.
- API Development: Hands-on experience in developing REST APIs.
- System Design: Strong understanding of distributed systems, scalability, and availability.
Additional Skills (Preferred):
- Experience with modern frontend frameworks like React.js or AngularJS.
- Strong design and architecture capabilities.
What We Offer:
- Opportunity to work on cutting-edge technologies in a collaborative environment.
- Competitive salary and benefits package.
- Flexible hybrid working model.
- Chance to contribute to impactful projects in the Mortgage Finance domain.

Role Overview
We are looking for a highly skilled Product Engineer to join our dynamic team. This is an exciting opportunity to work on innovative FinTech solutions and contribute to the future of global payments. If you're passionate about backend development, API design, and scalable architecture, we'd love to hear from you!
Key Responsibilities
- Design, develop, and maintain scalable, high-performance backend systems.
- Write clean, maintainable, and efficient code while following best practices.
- Build and optimize RESTful APIs and database queries.
- Collaborate with cross-functional teams to deliver 0 to 1 products.
- Ensure smooth CI/CD pipeline implementation and deployment automation.
- Contribute to open-source projects and stay updated with industry trends.
- Maintain a strong focus on security, performance, and reliability.
- Work with payment protocols and financial regulations to ensure compliance.
Required Skills & Qualifications
- ✅ 3+ years of professional software development experience.
- ✅ Proficiency in any backend language (with preference for Ruby on Rails).
- ✅ Strong foundation in architecture, design, and database optimization.
- ✅ Experience in building APIs and working with SQL/NoSQL databases.
- ✅ Familiarity with CI/CD practices and automation tools.
- ✅ Excellent problem-solving and analytical skills.
- ✅ Strong track record of open-source contributions (minimum 50 stars on GitHub).
- ✅ Passion for FinTech and payment systems.
- ✅ Strong communication skills and ability to work collaboratively in a team.
Nice to Have
- Prior experience in financial services or payment systems.
- Exposure to microservices architecture and cloud platforms.
- Knowledge of containerization tools like Docker & Kubernetes.

Role overview
- 5 to 7 years of overall experience; Node.js experience is a must.
- At least 3 years of experience, or a couple of large-scale products delivered, on microservices.
- Strong design skills in microservices and AWS platform infrastructure.
- Excellent programming skills in Python, Node.js, and Java.
- Hands-on development of REST APIs.
- Good understanding of the nuances of distributed systems, scalability, and availability.
What would you do here
- Work as a Backend Developer building cloud web applications.
- Be part of the team working on various types of web applications related to Mortgage Finance.
- Solve the real-world problem of designing, implementing, and helping develop a new enterprise-class product from the ground up.
- Apply expertise in AWS cloud infrastructure and microservices architecture around the AWS service stack (Lambda, SQS, SNS, MySQL databases), along with Docker and containerized solutions/applications.
- Work with relational and NoSQL databases and scalable designs.
- Solve challenging problems by developing elegant, maintainable code.
- Deliver rapid iterations of software based on user feedback and metrics.
- Help the team make key decisions on our product and technology direction.
- Actively contribute to the adoption of frameworks, standards, and new technologies.

AccioJob is conducting a Walk-In Hiring Drive with Infrrd for the position of Java Full Stack Developer.
To apply, register and select your slot here: https://go.acciojob.com/3UTekG
Required Skills: DSA, OOPS, SQL, Java, Python
Eligibility:
- Degree: BTech./BE, MTech./ME
- Branch: Computer Science/CSE/Other CS related branch, IT
- Graduation Year: 2026
Work Details:
- Work Location: Bangalore (Onsite)
- Stipend Range: 30k
- Stipend Duration: 12 Months
- CTC: 6 LPA to 9 LPA
Evaluation Process:
Round 1: Offline Assessment at AccioJob Bangalore Centre
Further Rounds (for shortlisted candidates only):
- Profile Evaluation
- Technical Interview 1
- Technical Interview 2
Important Note: Bring your laptop & earphones for the test.
Register here: https://go.acciojob.com/3UTekG

Roles and responsibilities-
- Act as tech lead in one of the feature teams; the candidate needs to work alongside the team lead, handling the team without much guidance
- Good communication and leadership skills
- Nurture and build next level talent within the team
- Work in collaboration with other vendors and client development team(s)
- Flexible to learn new tech areas
- Lead complete lifecycle of feature - from feature inception to solution, story grooming, delivery, and support features in production
- Ensure and build the controls and processes for continuous delivery of applications, considering all stages of the process and its automations
- Interact with teammates from across the business and be comfortable explaining technical concepts to non-technical audiences
- Create robust, scalable, flexible, and relevant solutions that help transform product and businesses
Must haves:
- Spark
- Scala
- Postgres (or any SQL database)
- Elasticsearch (or any NoSQL database)
- Azure (if not, any other cloud experience)
- Big data processing
Good to have:
- Golang
- Databricks
- Kubernetes

We’re hiring a Full Stack Developer (5+ years, Pune location) to join our growing team!
You’ll be working with React.js, Node.js, JavaScript, APIs, and cloud deployments to build scalable and high-performing web applications.
Responsibilities include developing responsive apps, building RESTful APIs, working with SQL/NoSQL databases, and deploying apps on AWS/Docker.
Experience with CI/CD, Git, secure coding practices (OAuth/JWT), and Agile collaboration is a must.
If you’re passionate about full stack development and want to work on impactful projects, we’d love to connect!

Position: Tableau Developer
Experience: 5-7 years
Location: Bangalore
Key Responsibilities:
· Design, develop, and maintain interactive dashboards and reports using Tableau, ensuring high-quality visualizations that meet business requirements.
· Write and optimize complex SQL queries to extract, manipulate, and analyse data from various sources, ensuring data integrity and accuracy.
· Stay updated on technologies and trends related to data visualization and analytics, including advanced analytics, big data, and data science. Familiarity with tools such as R, Python, and SAS is a plus.
· Utilize Snowflake for data warehousing solutions, including data modelling, ETL processes, and performance tuning to support Tableau reporting.
· Work effectively in interdisciplinary global teams, influencing stakeholders within a matrix organization to ensure alignment on reporting solutions.
· Incorporate Tableau best practices in reporting solutions and guide team members in their use to enhance overall reporting quality.
· Utilize excellent analytical and problem-solving skills to address data-related challenges and provide actionable insights.
· Communicate effectively with both technical and non-technical stakeholders to understand their reporting needs and deliver tailored solutions.
· Additional Skills: Experience with other visualization tools (e.g., Spotfire, Power BI) and programming languages (e.g., R, Python, JavaScript) is advantageous.
Qualifications:
· Bachelor’s degree in Informatics, Information Systems, Data Science, or a related field.
· 5+ years of relevant professional experience in data analytics, performance management, or related fields.
· Strong understanding of clinical development and/or biopharma industry practices is preferred.
· Proven experience in completing computerized systems validation and testing methodologies.
About the company
Sigmoid is a leading data solutions company that partners with Fortune 500 enterprises to drive digital transformation through AI, big data, and cloud technologies. With a focus on scalability, performance, and innovation, Sigmoid delivers cutting-edge solutions to solve complex business challenges.
About the role
You will be responsible for building a highly scalable, extensible, and robust application. This position reports to the Engineering Manager.
Responsibilities:
- Align Sigmoid with key client initiatives
- Interface daily with customers across leading Fortune 500 companies to understand strategic requirements
- Understand business requirements and tie them to technology solutions
- Be open to working from the client location as per the demands of the project/customer
- Facilitate technical aspects of the engagement
- Develop and evolve highly scalable and fault-tolerant distributed components using Java technologies
- Bring excellent experience in application development and support, integration development, and quality assurance
- Provide technical leadership and manage it on a day-to-day basis
- Stay up to date on the latest technology to ensure the greatest ROI for customers and Sigmoid
- Be a hands-on coder with a good understanding of enterprise-level code
- Design and implement APIs, abstractions, and integration patterns to solve challenging distributed computing problems
- Experience in defining technical requirements, data extraction, data transformation, automating jobs, productionizing jobs, and exploring new big data technologies within a parallel processing environment
Culture:
- Must be a strategic thinker with the ability to think unconventional / out-of-box
- Analytical and solution driven orientation
- Raw intellect, talent and energy are critical
- Entrepreneurial and Agile: understands the demands of a private, high growth company
- Ability to be both a leader and hands-on "doer"
Qualifications:
- A track record of relevant work experience and a degree in Computer Science or a related technical discipline is required
- Experience in developing enterprise-scale applications and capability in building frameworks, design patterns, etc. Should be able to understand and tackle technical challenges and propose comprehensive solutions
- Experience with functional and object-oriented programming; Java or Python is a must
- Experience in Spring Boot, APIs, and SQL
- Good to have: Git, Airflow, Node.js, Python, Angular
- Experience with database modeling and development, data mining and warehousing
- Unit, Integration and User Acceptance Testing
- Effective communication skills (both written and verbal)
- Ability to collaborate with a diverse set of engineers, data scientists and product managers
- Comfort in a fast-paced start-up environment
- Experience in Agile methodology
- Proficient with SQL and its variation among popular databases
- Develop, and maintain Java applications using Core Java, Spring framework, JDBC, and threading concepts.
- Strong understanding of the Spring framework and its various modules.
- Experience with JDBC for database connectivity and manipulation
- Utilize database management systems to store and retrieve data efficiently.
- Proficiency in Core Java 8 and a thorough understanding of threading concepts and concurrent programming.
- Experience working with relational and NoSQL databases.
- Basic understanding of cloud platforms such as Azure and GCP; experience with DevOps practices is an added advantage.
- Knowledge of containerization technologies (e.g., Docker, Kubernetes)
- Perform debugging and troubleshooting of applications using log analysis techniques.
- Understand multi-service flow and integration between components.
- Handle large-scale data processing tasks efficiently and effectively.
- Hands on experience using Spark is an added advantage.
- Good problem-solving and analytical abilities.
- Collaborate with cross-functional teams to identify and solve complex technical problems.
- Knowledge of Agile methodologies such as Scrum or Kanban
- Stay updated with the latest technologies and industry trends to continuously improve development processes and methodologies
If interested, please share your resume with the following details:
Total Experience -
Relevant Experience in Java, Spring, Data Structures, Algorithms, SQL -
Relevant Experience in Cloud - AWS/Azure/GCP -
Current CTC -
Expected CTC -
Notice Period -
Reason for change -


Job Title: Backend Engineer – Python / Golang / Rust
Location: Bangalore, India
Experience Required: Minimum 2–3 years
About the Role
We are looking for a passionate Backend Engineer to join our growing engineering team. The ideal candidate should have hands-on experience in building enterprise-grade, scalable backend systems using microservices architecture. You will work closely with product, frontend, and DevOps teams to design, develop, and optimize robust backend solutions that can handle high traffic and ensure system reliability.
Key Responsibilities
• Design, develop, and maintain scalable backend services and APIs.
• Architect and implement microservices-based systems ensuring modularity and resilience.
• Optimize application performance, database queries, and service scalability.
• Collaborate with frontend engineers, product managers, and DevOps teams for seamless delivery.
• Implement security best practices and ensure data protection compliance.
• Write and maintain unit tests, integration tests, and documentation.
• Participate in code reviews, technical discussions, and architecture design sessions.
• Monitor, debug, and improve system performance in production environments.
Required Skills & Experience
• Programming Expertise:
• Advanced proficiency in Python (Django, FastAPI, or Flask), OR
• Strong experience in Golang or Rust for backend development.
• Microservices Architecture: Hands-on experience in designing and maintaining distributed systems.
• Database Management: Expertise in PostgreSQL, MySQL, MongoDB, including schema design and optimization.
• API Development: Strong experience in RESTful APIs and GraphQL.
• Cloud Platforms: Proficiency with AWS, GCP, or Azure for deployment and scaling.
• Containerization & Orchestration: Solid knowledge of Docker and Kubernetes.
• Messaging & Caching: Experience with Redis, RabbitMQ, Kafka, and caching strategies (Redis, Memcached).
• Version Control: Strong Git workflows and collaboration in team environments.
• Familiarity with CI/CD pipelines, DevOps practices, and cloud-native deployments.
• Proven experience working on production-grade, high-traffic applications.
Preferred Qualifications
• Understanding of software architecture patterns (event-driven, CQRS, hexagonal, etc.).
• Experience with Agile/Scrum methodologies.
• Contributions to open-source projects or strong personal backend projects.
• Experience with observability tools (Prometheus, Grafana, ELK, Jaeger).
Why Join Us?
• Work on cutting-edge backend systems that power enterprise-grade applications.
• Opportunity to learn and grow with a fast-paced engineering team.
• Exposure to cloud-native, microservices-based architectures.
• Collaborative culture that values innovation, ownership, and technical excellence.
Desired Competencies (Technical/Behavioral Competency)
Loan IQ domain and practical knowledge
7+ years of working knowledge of LoanIQ across development, support, and QA
In-depth knowledge of the LoanIQ database schema
Mainframe, Java, and .NET skills coupled with database expertise
Strong knowledge of relational databases such as Oracle 12c/19c
Strong business domain/functional knowledge; good understanding of Mainframes, SQL, Unix, .NET, Java, and ServiceNow
Responsibilities / Expectations from the Role
Loan IQ Domain and Practical knowledge.
In-depth knowledge of the LoanIQ database schema. Mainframe, Java, and .NET skills coupled with database expertise. Strong knowledge of relational databases such as Oracle.
On-the-ground, value-driven thought leadership (continuous improvement). North American commercial banking and domain knowledge. Strong business domain/functional knowledge of batch operations and understanding of Mainframes, SQL, Unix, .NET, Java, and ServiceNow.
Onshore/offshore coordination on a shift basis. Familiarity with log monitoring applications/software. Understanding of various reports. Review, assess, and document application, infrastructure, and middleware platform service-level requirements; document failure and recovery scenarios and performance against design and expectations.
Flexibility to work in shifts. Excellent communication skills.
Coordination with Application Development, Infrastructure Engineering, and Production Support as needed. Support triage for escalated issues and an on-call rotation. Comfortable with generating reports and creating summaries. Experience with file transfer mechanisms and software. Ability to maintain and enhance complex job dependency charts.

Holistic technology solutions for the entertainment and leisure industry.


Required Qualifications & Skills:
- Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent practical experience.
- 5+ years of professional experience in backend software development, with a strong focus on systems using the .NET Framework / .NET Core (C#).
- Proven, significant experience building backend services with C#, ASP.NET Web API, and Entity Framework.
- Expert-level proficiency in T-SQL, including writing and optimizing complex queries, stored procedures, and functions, and an understanding of database design principles (preferably with MS SQL Server).
- Demonstrated ability to read, understand, and analyze complex legacy code to determine functionality and troubleshoot issues effectively.
- Strong analytical and problem-solving skills, with experience in debugging and resolving production system issues.
- Experience in leading technical initiatives or mentoring junior engineers.
- Formal team lead experience is a strong plus for the Lead-level role.
- Solid understanding of software development lifecycle (SDLC), APIs, and backend architecture patterns.
- Excellent communication and interpersonal skills.
Responsibilities:
- Lead and Mentor: Guide, manage, and mentor a backend engineering team of approximately 4 members, fostering a collaborative and high-performing environment.
- Backend Development: Architect, design, develop, test, and deploy scalable and maintainable backend services, APIs, and database solutions using .NET (C#) and T-SQL.
- Enhancement Ownership: Take ownership of technical design and implementation for new features and enhancements requested for our POS and web platforms.
- Legacy System Expertise: Dive into existing codebases (.NET and T-SQL) to thoroughly understand current system functionality, data flows, and business logic, becoming a subject matter expert.
- Production Support: Act as a key escalation point for diagnosing and resolving complex production issues related to the backend systems. Perform root cause analysis and implement effective solutions.
- Technical Guidance: Provide technical direction, conduct code reviews, establish and enforce coding standards and best practices within the team.
- System Knowledge & Communication: Clearly articulate how backend systems work and answer technical questions from team members and other stakeholders.
- Collaboration: Work closely with front-end developers, QA testers, product managers, and potentially clients to deliver integrated solutions

About the Role
We are seeking motivated Data Engineering Interns to join our team remotely for a 3-month internship. This role is designed for students or recent graduates interested in working with data pipelines, ETL processes, and big data tools. You will gain practical experience in building scalable data solutions. While this is an unpaid internship, interns who successfully complete the program will receive a Completion Certificate and a Letter of Recommendation.
Responsibilities
- Assist in designing and building data pipelines for structured and unstructured data.
- Support ETL (Extract, Transform, Load) processes to prepare data for analytics.
- Work with databases (SQL/NoSQL) for data storage and retrieval.
- Help optimize data workflows for performance and scalability.
- Collaborate with data scientists and analysts to ensure data quality and consistency.
- Document workflows, schemas, and technical processes.
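The ETL responsibilities above can be illustrated end to end with the standard library alone; this is a deliberately tiny, hypothetical sketch (an in-memory CSV source and a SQLite target stand in for real systems):

```python
import csv
import io
import sqlite3

# Extract: read raw rows from a CSV source (an in-memory sample here).
raw = io.StringIO("id,amount,currency\n1, 100.50 ,usd\n2, 75.00 ,eur\n")
rows = list(csv.DictReader(raw))

# Transform: trim whitespace, cast types, normalize currency codes.
clean = [
    (int(r["id"]), float(r["amount"].strip()), r["currency"].strip().upper())
    for r in rows
]

# Load: write the cleaned rows into a relational store.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE payments (id INTEGER, amount REAL, currency TEXT)")
con.executemany("INSERT INTO payments VALUES (?, ?, ?)", clean)

# Downstream consumers query the loaded table.
total = con.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
print(total)  # 175.5
```

Production pipelines replace each stage with scalable tooling (S3/Kafka for extract, Spark for transform, a warehouse for load), but the extract-transform-load shape is the same.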
Requirements
- Strong interest in data engineering, databases, and big data systems.
- Basic knowledge of SQL and relational database concepts.
- Familiarity with Python, Java, or Scala for data processing.
- Understanding of ETL concepts and data pipelines.
- Exposure to cloud platforms (AWS, Azure, or GCP) is a plus.
- Familiarity with big data frameworks (Hadoop, Spark, Kafka) is an advantage.
- Good problem-solving skills and ability to work independently in a remote setup.
What You’ll Gain
- Hands-on experience in data engineering and ETL pipelines.
- Exposure to real-world data workflows.
- Mentorship and guidance from experienced engineers.
- Completion Certificate upon successful completion.
- Letter of Recommendation based on performance.
Internship Details
- Duration: 3 months
- Location: Remote (Work from Home)
- Stipend: Unpaid
- Perks: Completion Certificate + Letter of Recommendation


🚀 Hiring: Python Full Stack Developer
⭐ Experience: 4+ Years
📍 Location: Gurgaon
⭐ Work Mode:- Hybrid
⏱️ Notice Period: Immediate Joiners
(Only immediate joiners & candidates serving notice period)
🎇 About the Role:-
We are looking for an experienced Python Full Stack Developer (Backend Focus) with 4–6 years of experience to join our dynamic team. You will play a key role in backend development, API design, and data processing, while also contributing to frontend tasks when needed. This position provides excellent opportunities for growth and exposure to cutting-edge technologies.
✨ Required Skills & Experience
✅ Backend Development: Python (Django/Flask), MVC patterns
✅ Databases: SQL, PostgreSQL/MySQL
✅ API Development: RESTful APIs
✅ Testing: pytest, unittest, TDD
✅ Version Control: Git workflows
✅ Frontend Basics: React, JavaScript
✅ DevOps & Tools: Docker basics, CI/CD concepts, JSON/XML/CSV handling
✅ Cloud: Basic Azure knowledge
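The testing requirement above (pytest/unittest, TDD) amounts to writing small, behavior-focused test cases; a minimal `unittest` sketch, where `slugify` is a hypothetical helper invented for the example:

```python
import unittest

def slugify(title: str) -> str:
    """Hypothetical helper under test: lowercase a title and hyphenate words."""
    return "-".join(title.lower().split())

class SlugifyTest(unittest.TestCase):
    def test_basic(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_extra_whitespace(self):
        # split() with no argument collapses runs of whitespace.
        self.assertEqual(slugify("  Python   Full Stack  "), "python-full-stack")

# Run the suite programmatically (rather than via unittest.main()).
suite = unittest.defaultTestLoader.loadTestsFromTestCase(SlugifyTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True
```

With pytest the same cases would be plain functions using bare `assert`; in TDD the failing test is written before `slugify` itself.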

Profile: AWS Data Engineer
Mandatory skills: AWS + Databricks + PySpark + SQL
Location: Bangalore/Pune/Hyderabad/Chennai/Gurgaon
Notice Period: Immediate
Key Requirements :
- Design, build, and maintain scalable data pipelines to collect, process, and store data from multiple datasets.
- Optimize data storage solutions for better performance, scalability, and cost-efficiency.
- Develop and manage ETL/ELT processes to transform data as per schema definitions, apply slicing and dicing, and make it available for downstream jobs and other teams.
- Collaborate closely with cross-functional teams to understand system and product functionalities, pace up feature development, and capture evolving data requirements.
- Engage with stakeholders to gather requirements and create curated datasets for downstream consumption and end-user reporting.
- Automate deployment and CI/CD processes using GitHub workflows, identifying areas to reduce manual, repetitive work.
- Ensure compliance with data governance policies, privacy regulations, and security protocols.
- Utilize cloud platforms like AWS and work on Databricks for data processing with S3 Storage.
- Work with distributed systems and big data technologies such as Spark, SQL, and Delta Lake.
- Integrate with SFTP to push data securely from Databricks to remote locations.
- Analyze and interpret Spark query execution plans to fine-tune queries for faster and more efficient processing.
- Strong problem-solving and troubleshooting skills in large-scale distributed systems.

• Data Pipeline Development: Design and implement scalable data pipelines using PySpark and Databricks on AWS cloud infrastructure
• ETL/ELT Operations: Extract, transform, and load data from various sources using Python, SQL, and PySpark for batch and streaming data processing
• Databricks Platform Management: Develop and maintain data workflows, notebooks, and clusters in Databricks environment for efficient data processing
• AWS Cloud Services: Utilize AWS services including S3, Glue, EMR, Redshift, Kinesis, and Lambda for comprehensive data solutions
• Data Transformation: Write efficient PySpark scripts and SQL queries to process large-scale datasets and implement complex business logic
• Data Quality & Monitoring: Implement data validation, quality checks, and monitoring solutions to ensure data integrity across pipelines
• Collaboration: Work closely with data scientists, analysts, and other engineering teams to support analytics and machine learning initiatives
• Performance Optimization: Monitor and optimize data pipeline performance, query efficiency, and resource utilization in Databricks and AWS environments
Required Qualifications:
• Experience: 3+ years of hands-on experience in data engineering, ETL development, or related field
• PySpark Expertise: Strong proficiency in PySpark for large-scale data processing and transformations
• Python Programming: Solid Python programming skills with experience in data manipulation libraries (pandas, etc.)
• SQL Proficiency: Advanced SQL skills including complex queries, window functions, and performance optimization
• Databricks Experience: Hands-on experience with Databricks platform, including notebook development, cluster management, and job scheduling
• AWS Cloud Services: Working knowledge of core AWS services (S3, Glue, EMR, Redshift, IAM, Lambda)
• Data Modeling: Understanding of dimensional modeling, data warehousing concepts, and ETL best practices
• Version Control: Experience with Git and collaborative development workflows
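The "advanced SQL" bullet above singles out window functions; a small, hypothetical example of a per-region running total, run here against SQLite (version 3.25+), which accepts the same `OVER` clause syntax as PostgreSQL and Redshift:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, day INTEGER, amount REAL)")
con.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("east", 1, 10.0), ("east", 2, 20.0), ("west", 1, 5.0), ("west", 2, 7.0)],
)

# A window function partitions by region and orders by day, computing a
# running total without collapsing rows the way GROUP BY would.
rows = con.execute(
    """
    SELECT region, day, amount,
           SUM(amount) OVER (PARTITION BY region ORDER BY day) AS running_total
    FROM sales
    ORDER BY region, day
    """
).fetchall()

for row in rows:
    print(row)
```

Each row keeps its own detail columns while gaining an aggregate computed over its partition; this is the distinction interviewers usually probe when they say "window functions" rather than plain aggregation.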
Preferred Qualifications:
• Education: Bachelor's degree in Computer Science, Engineering, Mathematics, or related technical field
• Advanced AWS: Experience with additional AWS services like Athena, QuickSight, Step Functions, and CloudWatch
• Data Formats: Experience working with various data formats (JSON, Parquet, Avro, Delta Lake)
• Containerization: Basic knowledge of Docker and container orchestration
• Agile Methodology: Experience working in Agile/Scrum development environments
• Business Intelligence Tools: Exposure to BI tools like Tableau, Power BI, or Databricks SQL Analytics
Technical Skills Summary:
Core Technologies:
- PySpark & Spark SQL
- Python (pandas, boto3)
- SQL (PostgreSQL, MySQL, Redshift)
- Databricks (notebooks, clusters, jobs, Delta Lake)
AWS Services:
- S3, Glue, EMR, Redshift
- Lambda, Athena
- IAM, CloudWatch
Development Tools:
- Git/GitHub
- CI/CD pipelines, Docker
- Linux/Unix command line
Location & Work Model:
- Position: Contract to hire
- Bangalore - Marathahalli;
- Work mode - Work from Office; all 5 days
- Looking for immediate joiners
Technical Requirements:
- Strong experience in Java Backend Development
- Proficiency in both SQL & NoSQL databases (e.g., MongoDB, PostgreSQL, MySQL)
- Basic knowledge of DevOps tools (CI/CD pipeline)
- Familiarity with cloud providers (AWS, GCP)
- Ability to quickly learn and adapt to emerging technologies, including AI-driven tools, and automation solutions
- Strong problem-solving mindset with an interest in leveraging AI and data-driven approaches for backend optimizations
Soft Skills & Mindset:
- Strong communication & articulation skills
- Structured thought process and a logical approach to problem-solving
- Interest & willingness to learn and adapt to new technologies
Roles and Responsibilities:
- Develop high-quality code and employ object-oriented design principles while strictly adhering to best coding practices.
- Ability to work independently
- Demonstrate substantial expertise in Core Java, Multithreading, and Spring/Spring Boot frameworks.
- Possess a solid understanding of Spring, Hibernate, Caching Frameworks, and Memory Management.
- Proficient in crafting complex analytical SQL queries to meet project requirements.
- Contribute to the development of Highly Scalable applications.
- Design and implement Rest-based applications with efficiency and precision.
- Create comprehensive unit tests using frameworks such as JUnit and Mockito.
- Engage in the Continuous Integration/Continuous Deployment (CI/CD) process and utilize build tools like Git and Maven.
- Familiarity with any Cloud service provider is considered an added advantage.
Required Skills and Experience :
- Experience in cloud platform AWS is necessary.
- Experience in big data processing frameworks like Apache Spark, Flink, Kafka, etc.
- In-depth proficiency in Java, Spring Boot, Spring Frameworks, Hibernate, SQL, and Unit Testing frameworks.
- Good knowledge of SQL and experience with complex queries is a must
- Experience in supporting and debugging issues in the production environment is a must.
- Experience with analytical databases such as Redshift, BigQuery, Snowflake, and ClickHouse is a plus.

Job Title: React.js Developer
We are seeking a talented React.js Developer to build and maintain high-quality web applications. The ideal candidate should have strong experience in JavaScript/TypeScript, React.js, HTML, CSS, and state management (Redux/Context API). You will work closely with our team to develop responsive UIs, integrate APIs, and ensure performance optimization.
Requirements:
- 2+ years of experience in React.js development
- Strong knowledge of JavaScript (ES6+), React hooks, and component-based architecture
- Familiarity with RESTful APIs, Git, and modern front-end tools
- Bonus: Experience with Next.js, Tailwind, or testing frameworks
About The Company:
Dolat is a dynamic team of traders, puzzle solvers, and coding enthusiasts focused on tackling complex challenges in the financial world. We specialize in trading in volatile markets and developing cutting-edge technologies and strategies. We're seeking a skilled Linux Support Engineer to manage over 400 servers and support 100+ users. Our engineers ensure a high-performance, low-latency environment while maintaining simplicity and control. If you're passionate about technology, trading, and problem-solving, this is the place to engineer your skills into a rewarding career.
Qualifications:
- B.E/ B.Tech
- Experience: 1-3 years.
- Job location – Andheri West, Mumbai.
Responsibilities:
- Troubleshoot network issues, kernel panics, system hangs, and performance bottlenecks.
- Fine-tune processes for minimal jitter in a low-latency environment.
- Support low-latency servers, lines, and networks, participating in on-call rotations.
- Install, configure, and deploy fully-distributed Red Hat Linux systems.
- Deploy, configure, and monitor complex trading applications.
- Provide hands-on support to trading, risk, and compliance teams (Linux & Windows platforms).
- Automate processes and analyze performance metrics to improve system efficiency.
- Collaborate with development teams to maintain a stable, high-performance trading environment.
- Drive continuous improvement, system reliability, and simplicity.
- Resolve issues in a fast-paced, results-driven IT team.
- Provide level-one and level-two support for tools, systems testing, and production releases.
Skills Required:
- Expertise in Linux kernel tuning and configuration management (CFEngine).
- Experience with hardware testing/integration and IT security.
- Proficient in maintaining Cisco, Windows, and PC hardware.
- Good knowledge of Perl, Python, PowerShell, and Bash.
- Hands-on knowledge of SSH, iptables, NFS, DNS, DHCP, and LDAP.
- Experience with Open Source tools (Nagios, SmokePing, MRTG) for enterprise-level systems.
- Knowledge of maintaining Cisco ports/VLANs/802.1X
- Solid understanding of OS and network architectures.
- Red Hat certification and SQL/database knowledge.
- Ability to manage multiple tasks in a fast-paced environment.
- Excellent communication skills and fluency in English.
- Proven technical problem-solving capabilities.
- Strong documentation and knowledge management skills.
- Software development skills are a bonus.
- Preferred SQL and database administration skills.
Industry
- Financial Services
Senior Back-end Engineer Developer
What We Need
Looking for a senior back-end developer who will start working in our Bangalore office and will then be given an opportunity to move to the Netherlands to work closely with our clients.
- A highly motivated and experienced back-end software engineer / developer with a proven track record (at least 5 years of experience).
- A Bachelor’s degree in computer science.
- Someone who loves to work in a multidisciplinary team of engineers and business colleagues in a high-tech environment.
- You are able to work in a dynamic and demanding environment, are a real team player, and have a speak-up mentality to promote your ideas in a concise way.
- You are a problem-solver and see yourself as a hardcore web developer.
- You have knowledge of, and experience with, different web technologies.
- You are skilled with implementing architecture & design patterns.
- You can write modular code that is configurable, extensible and testable.
- You have great analytical skills and conceptual understanding, and are able to quickly understand new technical concepts.
- You have a strong interest in the latest trends in software development & web technologies.
- You have strong communication skills to explain complex technical concepts.
- You are fluent in English, both verbal and written.
We are looking for a back-end engineer / developer:
Proficiency / experience with following technologies & tools:
- Thorough and deep understanding of Java JDK 11+, our foundational programming language
- Spring Framework & AOP v5.2+
- Proven experience working with, and a deep understanding of Spring Boot 2.5+ and its modules (Web, Data JPA, Security OAuth2) and ability to explain complex use-cases related to persistency and web security
- Experience with Maven v3+
- Experience with containerization and deployment tools (e.g., Docker v20+ and Kaniko, Helm (charts) v3+ with Kubernetes deployments)
- Experience working with CI/CD tools like GitLab SCM & pipelines and JFrog Artifactory
- Strong knowledge working with different types of SQL and NoSQL databases such as PostgreSQL v12+, MongoDB v4+ and Neo4J v4+
- Proficient in working with DevOps engineers on cloud deployments (e.g., Azure subscriptions)
- Experience in Agile/Scrum & (pref.) SAFe (Scaled Agile Framework) and enabling tooling – Atlassian Jira Cloud / Jira Align
- Experienced and skilled in full-stack development.
- Leading and solutioning product development of secure and high-performance applications.
- Good understanding of REST APIs and working knowledge of HTTP(S).
- Experienced in the testing stack – JUnit / Mockito
- Experience with software quality & vulnerability testing – SonarQube and Blackduck
- Proficient in writing software documentation on Atlassian Wiki
- Proficient in implementing data structures, algorithm design, and OOP concepts.


About Data Axle:
Data Axle Inc. has been an industry leader in data, marketing solutions, sales and research for over 50 years in the USA. Data Axle now has an established strategic global centre of excellence in Pune. This centre delivers mission-critical data services to its global customers, powered by its proprietary cloud-based technology platform and by leveraging proprietary business & consumer databases.
Data Axle Pune is pleased to have achieved certification as a Great Place to Work!
Roles & Responsibilities:
We are looking for a Data Scientist to join the Data Science Client Services team to continue our success of identifying high quality target audiences that generate profitable marketing return for our clients. We are looking for experienced data science, machine learning and MLOps practitioners to design, build and deploy impactful predictive marketing solutions that serve a wide range of verticals and clients. The right candidate will enjoy contributing to and learning from a highly talented team and working on a variety of projects.
We are looking for a Senior Data Scientist who will be responsible for:
- Ownership of design, implementation, and deployment of machine learning algorithms in a modern Python-based cloud architecture
- Design or enhance ML workflows for data ingestion, model design, model inference and scoring
- Oversight on team project execution and delivery
- For senior hires: establish peer-review guidelines for high-quality coding to support junior team members' skill growth, cross-training, and team efficiencies
- Visualize and publish model performance results and insights to internal and external audiences
Qualifications:
- Masters in a relevant quantitative, applied field (Statistics, Econometrics, Computer Science, Mathematics, Engineering)
- Minimum of 3.5 years of work experience in the end-to-end lifecycle of ML model development and deployment into production within a cloud infrastructure (Databricks is highly preferred)
- Proven ability to manage the output of a small team in a fast-paced environment and to lead by example in the fulfilment of client requests
- Exhibit deep knowledge of core mathematical principles relating to data science and machine learning (ML Theory + Best Practices, Feature Engineering and Selection, Supervised and Unsupervised ML, A/B Testing, etc.)
- Proficiency in Python and SQL required; PySpark/Spark experience a plus
- Ability to conduct a productive peer review and maintain proper code structure in GitHub
- Proven experience developing, testing, and deploying various ML algorithms (neural networks, XGBoost, Bayes, and the like)
- Working knowledge of modern CI/CD methods.
This position description is intended to describe the duties most frequently performed by an individual in this position. It is not intended to be a complete list of assigned duties but to describe a position level.
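The "A/B Testing" item in the qualifications above can be illustrated with a standard two-proportion z-test, a common way to compare conversion rates between campaign variants. This is a minimal sketch in pure Python; the campaign numbers are invented for illustration.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test statistic for an A/B conversion comparison."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical campaign: variant B converts 220/2000 vs. variant A's 180/2000.
z = two_proportion_z(180, 2000, 220, 2000)
print(round(z, 3))
```

A |z| above roughly 1.96 corresponds to significance at the 5% level under the usual normal approximation.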
Reports Developer
Description - Data Insights Analyst specializing in dashboard development, data validation, and ETL testing using Tableau, Cognos, and SQL.
Work Experience: 5-9 years
Key Responsibilities
Insights Solution Development:
• Develop, maintain, and enhance dashboards and static reports using Tableau and IBM Cognos.
• Collaborate with Senior Data Insights specialists to design solutions that meet customer needs.
• Utilize data from modeled Business Data sources, Structured DB2 Datamarts, and DB2 Operational Data stores to fulfill business requirements.
• Conduct internal analytics testing on all data products to ensure accuracy and reliability.
• Use SQL to pull and prepare data for use in Dashboards and Reports
Data Management Tasks:
• Test new ETL developments, break-fixes, and enhancements using SQL-based tools to ensure the accuracy, volume, and quality of ETL changes.
• Participate in data projects, delivering larger-scale data solutions as part of a project team.
• Report defects using Jira and work closely with IT team professionals to ensure the timely retesting of data defects
• Utilize spec documentation and data lineage tools to understand the flow of data into analytics data sources
• Develop repeatable testing processes using SQL based tools
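The ETL-testing responsibilities above often boil down to reconciliation queries: comparing row counts and control totals between a source and a loaded target. The sketch below, with invented `src`/`tgt` tables and Python's built-in `sqlite3` standing in for the SQL-based tools the role mentions, shows one repeatable check.

```python
import sqlite3

# Hypothetical source and target tables for an ETL reconciliation check.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER, amount REAL);
    CREATE TABLE tgt (id INTEGER, amount REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt VALUES (1, 10.0), (2, 20.0), (3, 30.0);
""")

def reconcile(conn, source, target):
    """Repeatable post-load check: row counts and amount totals must match.
    Table names are interpolated for brevity; they are illustrative only."""
    q = "SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {}"
    src_count, src_sum = conn.execute(q.format(source)).fetchone()
    tgt_count, tgt_sum = conn.execute(q.format(target)).fetchone()
    return {"count_match": src_count == tgt_count,
            "sum_match": abs(src_sum - tgt_sum) < 1e-9}

result = reconcile(conn, "src", "tgt")
print(result)
```

Defects surfaced by checks like this would then be logged in Jira, per the responsibilities above.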
Technical Experience
• SQL
• Tableau
• Data Visualization
• Report Design
• Cognos Analytics
• Cognos Transformer
• OLAP Modeling (Cognos)
Additional Skills
An ideal candidate would have the following additional skills:
• Python
• SAS Programming
• MS Access
• MS Excel
Work Hours: We would like the majority of work hours to align with the U.S. Eastern time zone, with people working until 2 p.m. or 3 p.m. EST, so that work hours overlap with times when the Senior Analysts and databases are available.

Job Description: .NET + Angular Full Stack Developer
Position: Full Stack Developer (.NET + Angular)
Experience: 3 – 5 Years
About the Role
We are looking for a highly skilled .NET + Angular Full Stack Developer to join our dynamic team. The ideal candidate should have strong expertise in both back-end and front-end development, hands-on experience with .NET Core and Angular, and a passion for building scalable, secure, and high-performance applications.
Key Responsibilities
- Design, develop, and maintain scalable, high-quality web applications using .NET Core 8, ASP.NET MVC, Web API, and Angular 13+.
- Build and integrate RESTful APIs and ensure seamless communication between front-end and back-end services.
- Develop, optimize, and maintain SQL Server (2012+) databases, ensuring high availability, performance, and reliability.
- Write complex stored procedures, functions, triggers, and perform query tuning and indexing for performance optimization.
- Work with Entity Framework/EF Core to implement efficient data access strategies.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Implement OAuth 2.0 authentication/authorization for secure access control.
- Write clean, testable, and maintainable code following Test-Driven Development (TDD) principles.
- Use Git / TFVC for version control and collaborate using Azure DevOps Services for CI/CD pipelines.
- Participate in code reviews, troubleshoot issues, and optimize application performance.
- Stay updated with emerging technologies and recommend improvements to enhance system architecture.
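The indexing and query-tuning responsibilities above can be sketched by inspecting a query plan before and after adding an index. SQL Server's tooling (execution plans, SQL Profiler) differs, so this self-contained sketch uses Python's built-in `sqlite3` and its `EXPLAIN QUERY PLAN`; the `orders` table is invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 100, float(i)) for i in range(1000)])

def plan(conn, sql):
    # Each EXPLAIN QUERY PLAN row's 4th column is the human-readable detail.
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT total FROM orders WHERE customer_id = 42"
before = plan(conn, query)   # full table scan: no usable index yet
conn.execute("CREATE INDEX ix_orders_customer ON orders(customer_id)")
after = plan(conn, query)    # index search after adding the index

print(before)
print(after)
```

The same before/after discipline applies when reading SQL Server execution plans: confirm the scan became a seek, then measure.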
Required Technical Skills
- 3+ years of experience in .NET development (C#, .NET Core 8, ASP.NET MVC, Web API).
- Strong experience in SQL Server development including:
- Query tuning, execution plan analysis, and performance optimization.
- Designing and maintaining indexes, partitioning strategies, and database normalization.
- Handling large datasets and optimizing stored procedures for scalability.
- Experience with SQL Profiler, Extended Events, and monitoring tools.
- Proficiency in Entity Framework / EF Core for ORM-based development.
- Familiarity with PostgreSQL and cross-database integration is a plus.
- Expertise in Angular 13+, HTML5, CSS, TypeScript, JavaScript, and Bootstrap.
- Experience with REST APIs development and integration.
- Knowledge of OAuth 2.0 and secure authentication methods.
- Hands-on experience with Git/TFVC and Azure DevOps for source control and CI/CD pipelines.
- Basic knowledge of Node.js framework is a plus.
- Experience with unit testing frameworks like NUnit, MSTest, etc.
Soft Skills
- Strong problem-solving and analytical skills, particularly in debugging performance bottlenecks.
- Excellent communication and collaboration abilities.
- Ability to work independently and in a team environment.
- Attention to detail and a passion for writing clean, scalable, and optimized code.
Apply here: https://forms.gle/DefR28CvNfepJT3o6
Roles and Responsibilities:
You’ll work closely with Goalkeep’s internal team and support client projects as needed. Your responsibilities will include:
- Infrastructure Maintenance & Optimization
- Design and review data pipeline diagrams for both client and internal projects
- Build data pipelines by writing clean, efficient SQL queries for data analysis and quality checks
- Monitor data pipeline performance and proactively raise alarms and fix issues
- Maintain and upgrade Goalkeep’s internal tech infrastructure
- Monitor infrastructure costs to identify inefficiencies and recommend cost-saving strategies across cloud services and tech subscriptions
- Internal Tech Enablement & Support
- Provide tech setup and troubleshooting support to internal teams and analysts so that we can successfully deliver on client projects
- Assist with onboarding new team members onto Goalkeep systems
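The "SQL queries for data analysis and quality checks" responsibility above can be sketched with two typical checks: counting null values and finding duplicated keys. The `responses` table and its columns are invented, and Python's built-in `sqlite3` is used so the sketch is self-contained (the SQL ports directly to PostgreSQL or MySQL).

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE responses (respondent_id INTEGER, district TEXT);
    INSERT INTO responses VALUES (1, 'North'), (2, NULL), (2, 'South'), (3, 'East');
""")

# Quality check 1: rows with a missing district.
null_count = conn.execute(
    "SELECT COUNT(*) FROM responses WHERE district IS NULL").fetchone()[0]

# Quality check 2: respondent ids that appear more than once.
dup_ids = [r[0] for r in conn.execute("""
    SELECT respondent_id FROM responses
    GROUP BY respondent_id HAVING COUNT(*) > 1
""")]

print(null_count, dup_ids)
```

Checks like these are easy to schedule against a pipeline's output tables so issues are raised proactively, per the monitoring responsibility above.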
What we’re looking for:
Hard Skills:
- Data modeling and database management (PostgreSQL, MySQL, or SQL Server)
- Installing and maintaining software on Linux systems
- Familiarity with cloud-based platforms (AWS / GCP / Azure) is a must
- Ability to troubleshoot based on system logs and performance indicators
- Data engineering: writing efficient SQL, designing pipelines
Soft Skills & Mindsets:
- Curiosity and accountability when investigating system issues
- Discipline to proactively maintain and monitor infra
Must-Know Tools:
- SQL (any dialect)
- Knowledge of software engineering best practices and version control (Git)
Preferred Qualifications:
- Engineering degree with a minimum of 2 years of experience working as a data scientist / data analyst
- Bachelor's degree in any STEM field
What’s in it for you?
- The chance to work at the intersection of social impact and technology.
- Learn and grow in areas such as cloud infrastructure, data governance, and data engineering.
- Be part of a close-knit team passionate about using data for good.
We are looking for an experienced DB2 developer/DBA who has worked on a critical application with a large database. The role requires the candidate to understand the landscape of the application and the data, including its topology across the online data store and its data warehousing counterparts. The challenges we strive to solve include scalability and performance when dealing with very large data sets and multiple data sources.
The role involves collaborating with global team members and provides a unique opportunity to network with a diverse group of people.
The candidate who fills this Database Developer role in our team will be involved in building and creating solutions from the requirements stage through deployment. A successful candidate is self-motivated, innovative, thinks outside the box, has excellent communication skills, and can work with ease with clients and stakeholders from both the business and technology sides.
Required Skills:
Expertise in writing complex data retrieval queries, stored procs and performance tuning
Experience in migrating large scale database from Sybase to a new tech stack
Expertise in relational databases (Sybase, Azure SQL Server, DB2) and NoSQL databases
Strong knowledge in Linux Shell Scripting
Working knowledge of Python programming
Working knowledge of Informatica
Good knowledge of Autosys or any such scheduling tool
Detail-oriented, with the ability to turn deliverables around quickly with a high degree of accuracy
Strong analytical skills, with the ability to interpret business requirements and produce functional and technical design documents
Good time management skills: the ability to prioritize and multi-task, handling multiple efforts at once
Strong desire to understand and learn domain.
Desired Skills:
Experience in Sybase, Azure SQL Server, DB2
Experience in migrating relational database to modern tech stack
Experience in a financial services/banking industry

Full Stack Engineer (Frontend Strong, Backend Proficient)
5-10 Years Experience
Contract: 6months+extendable
Location: Remote
Technical Requirements
Frontend Expertise (Strong)
*Need at least 4 years of experience in React web development, Node & AI.*
● Deep proficiency in React, Next.js, TypeScript
● Experience with state management (Redux, Context API)
● Frontend testing expertise (Jest, Cypress)
● Proven track record of achieving high Lighthouse performance scores
Backend Proficiency
● Solid experience with Node.js, NestJS (preferred), or ExpressJS
● Database management (SQL, NoSQL)
● Cloud technologies experience (AWS, Azure)
● Understanding of OpenAI and AI integration capabilities (bonus)
Full Stack Integration
● Excellent ability to manage and troubleshoot integration issues between frontend and backend systems
● Experience designing cohesive systems with proper separation of concerns
Job Title: MERN TECH
Location: Paschim Vihar, West Delhi
Company: Eye Mantra
Job Type: Full-time, Onsite
Salary: Commensurate with experience and interview performance
Experience:- 3+ years
Contact: +91 97180 11146 (Rizwana Siddique, HR)
Interview Mode: Face to Face
About Eye Mantra:
Eye Mantra is a premier eye care organization committed to delivering exceptional services using advanced technology. We’re growing fast and looking to strengthen our in-house tech team with talented individuals who share our passion for innovation and excellence in patient care.
Position Overview:
We are currently hiring a skilled Full Stack Developer to join our in-house development team. If you have strong experience working with the MERN stack, including Node.js, React.js, MongoDB, and SQL, and you thrive in a collaborative, fast-paced work environment, we’d love to connect with you.
This role requires working onsite at our West Delhi office (Paschim Vihar), where you’ll contribute directly to building and maintaining robust, scalable, and user-friendly applications that support our medical operations and patient services.
Responsibilities:
- Build and manage web applications using the MERN stack (MongoDB, Express, React, Node).
- Create and maintain efficient backend services and RESTful APIs.
- Develop intuitive frontend interfaces with React.js.
- Design and optimize relational databases using SQL.
- Work closely with internal teams to implement new features and enhance existing ones.
- Ensure applications perform well across all platforms and devices.
- Identify and resolve bugs and performance issues quickly.
- Stay current with emerging web development tools and trends.
- (Bonus) Leverage AWS or other cloud platforms to enhance scalability and performance.
Required Skills & Qualifications:
- Proficiency in Node.js for backend programming.
- Strong hands-on experience with React.js for frontend development.
- Good command of SQL and understanding of database design.
- Practical knowledge of the MERN stack.
- Experience using Git for version control and team collaboration.
- Excellent analytical and problem-solving abilities.
- Strong interpersonal and communication skills.
- Self-motivated, with the ability to manage tasks independently.