50+ SQL Jobs in India

Role: Lead Java Developer
Work Location: Chennai, Pune
Experience: 8+ years
Work Mode: Hybrid (3 days in office, 2 days from home)
Type: Full-time
Skill Set: Java + Spring Boot + SQL + Microservices + DevOps
Job Responsibilities:
Design, develop, and maintain high-quality software applications using Java and Spring Boot.
Develop and maintain RESTful APIs to support various business requirements.
Write and execute unit tests using TestNG to ensure code quality and reliability.
Work with NoSQL databases to design and implement data storage solutions.
Collaborate with cross-functional teams in an Agile environment to deliver high-quality software solutions.
Utilize Git for version control and collaborate with team members on code reviews and merge requests.
Troubleshoot and resolve software defects and issues in a timely manner.
Continuously improve software development processes and practices.
Description:
8+ years of professional experience in backend development using Java, including experience leading a team.
Strong expertise in Spring Boot, Apache Camel, Hibernate, JPA, and REST API design
Hands-on experience with PostgreSQL, MySQL, or other SQL-based databases
Working knowledge of AWS cloud services (EC2, S3, RDS, etc.)
Experience with DevOps practices (e.g., CI/CD and automated deployments).
Proficiency in using Docker for containerization and deployment.
Strong understanding of object-oriented programming, multithreading, and performance tuning
Self-driven and capable of working independently with minimal supervision
Role Summary:
We are seeking experienced Application Support Engineers to join our client-facing support team. The ideal candidate will
be the first point of contact for client issues, ensuring timely resolution, clear communication, and high customer satisfaction
in a fast-paced trading environment.
Key Responsibilities:
• Act as the primary contact for clients reporting issues related to trading applications and platforms.
• Log, track, and monitor issues using internal tools and ensure resolution within defined TAT (Turnaround Time).
• Liaise with development, QA, infrastructure, and other internal teams to drive issue resolution.
• Provide clear and timely updates to clients and stakeholders regarding issue status and resolution.
• Maintain comprehensive logs of incidents, escalations, and fixes for future reference and audits.
• Offer appropriate and effective resolutions for client queries on functionality, performance, and usage.
• Communicate proactively with clients about upcoming product features, enhancements, or changes.
• Build and maintain strong relationships with clients through regular, value-added interactions.
• Collaborate in conducting UAT, release validations, and production deployment verifications.
• Assist in root cause analysis and post-incident reviews to prevent recurrences.
Required Skills & Qualifications:
• Bachelor's degree in Computer Science, IT, or related field.
• 2+ years in Application/Technical Support, preferably in the broking/trading domain.
• Sound understanding of capital markets – Equity, F&O, Currency, Commodities.
• Strong technical troubleshooting skills – Linux/Unix, SQL, log analysis.
• Familiarity with trading systems, RMS, OMS, APIs (REST/FIX), and order lifecycle.
• Excellent communication and interpersonal skills for effective client interaction.
• Ability to work under pressure during trading hours and manage multiple priorities.
• Customer-centric mindset with a focus on relationship building and problem-solving.

At Rocketium, engineers aren’t just writing code. We’re building the engine that powers creativity at scale. Imagine helping a global brand launch hundreds of personalized ads in a day instead of weeks, or designing workflows where AI takes care of the grunt work so humans can focus on storytelling. That’s the kind of impact your code will have here.
As a Software Development Engineer with us, you will:
- Bring ideas to life: Work with product managers and designers to turn messy, real-world creative challenges into elegant software solutions.
- Build for scale: Write reliable, performant code that can handle thousands of assets being generated, automated, and personalized every day.
- Shape the future of AI workflows: Experiment with AI/automation to create features that change how creative teams think about production.
- Make the platform better, every day: Debug tricky issues, optimize performance, and refactor where needed — because we value craft as much as speed.
- Grow with the team: Share openly, learn from peers, and give feedback that sharpens everyone’s work.
Who you are:
- Your toolkit includes JavaScript/TypeScript, React, and Node.js, and you’re quick to pick up new tools as needed.
- You’re comfortable switching hats: debugging tricky issues one moment, brainstorming workflows with designers the next.
- Ambiguity doesn’t scare you; you enjoy finding clarity where none exists.
- Bonus points if you’ve dabbled in AI, workflow automation, or creative tech.
- Most importantly, you embody our prime directives.
This role requires a strong understanding of the entire Microsoft Power Platform suite, including Power Apps, Power Automate, and Power BI.
The candidate will design, develop, and implement solutions that streamline business processes, automate workflows, and create insightful dashboards.
Key Responsibilities & Skills
• Power Apps: Develop canvas apps and model-driven apps, build custom connectors, and leverage advanced features such as components, data flows, and authentication.
• Power Automate: Build robust workflows using triggers, actions, conditional logic, cloud connectors, and implement error handling and exception management.
• Power BI: Build data models, interactive dashboards, advanced DAX calculations, and drill-down visualizations.
• Integration: Connect Power Platform solutions with Office 365, Dynamics 365, Azure Logic Apps, and external APIs.
• Security: Implement security best practices, data governance, role-based access control, and encryption.
• Version Control: Use Git or similar version control systems for managing projects.
• Database Understanding: Strong foundation in relational databases, writing SQL queries for data extraction and manipulation.


Position: Senior Data Engineer
Overview:
We are seeking an experienced Senior Data Engineer to design, build, and optimize scalable data pipelines and infrastructure to support cross-functional teams and next-generation data initiatives. The ideal candidate is a hands-on data expert with strong technical proficiency in Big Data technologies and a passion for developing efficient, reliable, and future-ready data systems.
Reporting: Reports to the CEO or designated Lead as assigned by management.
Employment Type: Full-time, Permanent
Location: Remote (Pan India)
Shift Timings: 2:00 PM – 11:00 PM IST
Key Responsibilities:
- Design and develop scalable data pipeline architectures for data extraction, transformation, and loading (ETL) using modern Big Data frameworks.
- Identify and implement process improvements such as automation, optimization, and infrastructure re-design for scalability and performance.
- Collaborate closely with Engineering, Product, Data Science, and Design teams to resolve data-related challenges and meet infrastructure needs.
- Partner with machine learning and analytics experts to enhance system accuracy, functionality, and innovation.
- Maintain and extend robust data workflows and ensure consistent delivery across multiple products and systems.
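The ETL responsibilities above follow a common extract → transform → load shape. A minimal sketch of that shape, using only the Python standard library (the table and column names are hypothetical; a production pipeline would use a Big Data framework such as Spark):

```python
import csv
import io
import sqlite3

# Extract: read raw rows from a CSV source (inlined here for illustration).
raw = io.StringIO("user_id,amount\n1,10.50\n2,not_a_number\n3,7.25\n")
rows = list(csv.DictReader(raw))

# Transform: coerce types and drop rows that fail validation.
clean = []
for r in rows:
    try:
        clean.append((int(r["user_id"]), float(r["amount"])))
    except ValueError:
        continue  # in a real pipeline, route bad rows to a quarantine table

# Load: write the validated rows into the target store.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE payments (user_id INTEGER, amount REAL)")
con.executemany("INSERT INTO payments VALUES (?, ?)", clean)
total = con.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
print(total)  # 17.75
```

The same three-stage separation (and the quarantine-instead-of-crash choice for bad records) carries over directly to Spark or Databricks jobs.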
Required Qualifications:
- Bachelor’s degree in Computer Science, Engineering, or related field.
- 10+ years of hands-on experience in Data Engineering.
- 5+ years of recent experience with Apache Spark, with a strong grasp of distributed systems and Big Data fundamentals.
- Proficiency in Scala, Python, Java, or similar languages, with the ability to work across multiple programming environments.
- Strong SQL expertise and experience working with relational databases such as PostgreSQL or MySQL.
- Proven experience with Databricks and cloud-based data ecosystems.
- Familiarity with diverse data formats such as Delta Tables, Parquet, CSV, and JSON.
- Skilled in Linux environments and shell scripting for automation and system tasks.
- Experience working within Agile teams.
- Knowledge of Machine Learning concepts is an added advantage.
- Demonstrated ability to work independently and deliver efficient, stable, and reliable software solutions.
- Excellent communication and collaboration skills in English.
About the Organization:
We are a leading B2B data and intelligence platform specializing in high-accuracy contact and company data to empower revenue teams. Our technology combines human verification and automation to ensure exceptional data quality and scalability, helping businesses make informed, data-driven decisions.
What We Offer:
Our workplace embraces diversity, inclusion, and continuous learning. With a fast-paced and evolving environment, we provide opportunities for growth through competitive benefits including:
- Paid Holidays and Leaves
- Performance Bonuses and Incentives
- Comprehensive Medical Policy
- Company-Sponsored Training Programs
We are an Equal Opportunity Employer, committed to maintaining a workplace free from discrimination and harassment. All employment decisions are made based on merit, competence, and business needs.

Responsibilities
Develop and maintain web and backend components using Python, Node.js, and Zoho tools
Design and implement custom workflows and automations in Zoho
Perform code reviews to maintain quality standards and best practices
Debug and resolve technical issues promptly
Collaborate with teams to gather and analyze requirements for effective solutions
Write clean, maintainable, and well-documented code
Manage and optimize databases to support changing business needs
Contribute individually while mentoring and supporting team members
Adapt quickly to a fast-paced environment and meet expectations within the first month
Selection Process
1. HR Screening: Review of qualifications and experience
2. Online Technical Assessment: Test coding and problem-solving skills
3. Technical Interview: Assess expertise in web development, Python, Node.js, APIs, and Zoho
4. Leadership Evaluation: Evaluate team collaboration and leadership abilities
5. Management Interview: Discuss cultural fit and career opportunities
6. Offer Discussion: Finalize compensation and role specifics
Experience Required
2-4 years of relevant experience as a Zoho Developer
Proven ability to work as a self-starter and contribute individually
Strong technical and interpersonal skills to support team members effectively
Position Overview
We're seeking a skilled Full Stack Developer to build and maintain scalable web applications using modern technologies. You'll work across the entire development stack, from database design to user interface implementation.
Key Responsibilities
- Develop and maintain full-stack web applications using Node.js and TypeScript
- Design and implement RESTful APIs and microservices
- Build responsive, user-friendly front-end interfaces
- Design and optimize SQL databases and write efficient queries
- Collaborate with cross-functional teams on feature development
- Participate in code reviews and maintain high code quality standards
- Debug and troubleshoot application issues across the stack
Required Skills
- Backend: 3+ years experience with Node.js and TypeScript
- Database: Proficient in SQL (PostgreSQL, MySQL, or similar)
- Frontend: Experience with modern JavaScript frameworks (React, Vue, or Angular)
- Version Control: Git and collaborative development workflows
- API Development: RESTful services and API design principles
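The RESTful-API skill above is stack-agnostic. Purely as an illustration, here is a read-only resource endpoint sketched with Python's standard library for self-containment (the role itself centers on Node.js/TypeScript; the endpoint path and data are hypothetical):

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Hypothetical resource collection served at GET /users.
USERS = [{"id": 1, "name": "Asha"}, {"id": 2, "name": "Ravi"}]

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/users":
            body = json.dumps(USERS).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)  # unknown resource

    def log_message(self, *args):
        pass  # silence per-request logging for the demo

server = HTTPServer(("127.0.0.1", 0), Handler)  # port 0: pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

with urlopen(f"http://127.0.0.1:{server.server_port}/users") as resp:
    data = json.loads(resp.read())
print(data[0]["name"])  # Asha
server.shutdown()
```

The design points interviewers usually probe (resource-oriented paths, correct status codes, content negotiation via headers) are the same in Express or NestJS.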
Preferred Qualifications
- Experience with cloud platforms (AWS, Azure, or GCP)
- Knowledge of containerization (Docker)
- Familiarity with testing frameworks (Jest, Mocha, or similar)
- Understanding of CI/CD pipelines
What We Offer
- Competitive salary and benefits
- Flexible work arrangements
- Professional development opportunities
- Collaborative team environment
Key Responsibilities:
● Analyze and translate legacy MSSQL stored procedures into Snowflake Scripting (SQL) or JavaScript-based stored procedures.
● Rebuild and optimize data pipelines and transformation logic in Snowflake.
● Implement performance-tuning techniques such as query pruning, clustering keys, appropriate warehouse sizing, and materialized views.
● Monitor query performance using the Snowflake Query Profile and resolve bottlenecks.
● Ensure procedures are idempotent, efficient, and scalable for high-volume workloads.
● Collaborate with architects and data teams to ensure accurate and performant data migration.
● Write test cases to validate functional correctness and performance.
● Document changes and follow version control best practices (e.g., Git, CI/CD).
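On the idempotency requirement above: an idempotent procedure leaves the same end state no matter how many times it runs, which is why migrated load procedures typically use MERGE/upsert semantics rather than blind INSERTs. A minimal sketch of the pattern, using SQLite's upsert syntax as a stand-in for Snowflake's MERGE (table and key names are hypothetical):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT)")

def load_customers(rows):
    # Upsert: inserting an existing key updates the row instead of
    # duplicating it, so the load can be safely re-run (idempotent).
    con.executemany(
        "INSERT INTO dim_customer (id, name) VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET name = excluded.name",
        rows,
    )

load_customers([(1, "Acme"), (2, "Globex")])
load_customers([(1, "Acme"), (2, "Globex")])  # re-run: same end state
count = con.execute("SELECT COUNT(*) FROM dim_customer").fetchone()[0]
print(count)  # 2
```

In Snowflake the equivalent is a `MERGE INTO ... WHEN MATCHED THEN UPDATE ... WHEN NOT MATCHED THEN INSERT` inside the stored procedure.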
Required Skills:
● 4+ years of SQL development experience, including strong T-SQL proficiency.
● 2+ years of hands-on experience with Snowflake, including stored procedure development.
● Deep knowledge of query optimization and performance tuning in Snowflake.
● Familiarity with Snowflake internals: automatic clustering, micro-partitioning, result caching, and warehouse scaling.
● Solid understanding of ETL/ELT processes, preferably with tools like DBT, Informatica, or Airflow.
● Experience with CI/CD pipelines and Git-based version control
Note: One face-to-face (F2F) round is mandatory; as per the process, you will need to visit the office for it.

Job Description: Data Engineer
Location: Ahmedabad
Experience: 7+ years
Employment Type: Full-Time
We are looking for a highly motivated and experienced Data Engineer to join our team. As a Data Engineer, you will play a critical role in designing, building, and optimizing data pipelines that ensure the availability, reliability, and performance of our data infrastructure. You will collaborate closely with data scientists, analysts, and cross-functional teams to provide timely and efficient data solutions.
Responsibilities
● Design and optimize data pipelines for various data sources
● Design and implement efficient data storage and retrieval mechanisms
● Develop data modelling solutions and data validation mechanisms
● Troubleshoot data-related issues and recommend process improvements
● Collaborate with data scientists and stakeholders to provide data-driven insights and solutions
● Coach and mentor junior data engineers in the team
Skills Required:
● Minimum 5 years of experience in data engineering or related field
● Proficient in designing and optimizing data pipelines and data modeling
● Strong programming expertise in Python
● Hands-on experience with big data technologies such as Hadoop, Spark, and Hive
● Extensive experience with cloud data services such as AWS, Azure, and GCP
● Advanced knowledge of database technologies like SQL, NoSQL, and data warehousing
● Knowledge of distributed computing and storage systems
● Familiarity with DevOps practices, Power Automate, and Microsoft Fabric will be an added advantage
● Strong analytical and problem-solving skills with outstanding communication and collaboration abilities
Qualifications
● Bachelor's degree in Computer Science, Data Science, or a computer-related field
Required Qualifications
- Bachelor’s degree with a Commerce background or MBA in Finance (mandatory).
- 3+ years of hands-on implementation/project management experience
- Proven experience delivering projects in Fintech, SaaS, or ERP environments
- Strong expertise in accounting principles, R2R (Record-to-Report), treasury, and financial workflows.
- Hands-on SQL experience, including the ability to write and debug complex queries (joins, CTEs, subqueries)
- Experience working with ETL pipelines or data migration processes
- Proficiency in tools like Jira, Confluence, Excel, and project tracking systems
- Strong communication and stakeholder management skills
- Ability to manage multiple projects simultaneously and drive client success
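As a concrete instance of the "complex queries" requirement above (joins, CTEs, subqueries), here is a small reconciliation-style query of the kind an R2R workflow might need, run against SQLite purely for illustration; the tables and figures are hypothetical:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE invoices (id INTEGER, client TEXT, amount REAL);
INSERT INTO invoices VALUES (1, 'A', 100), (2, 'A', 50), (3, 'B', 70);
CREATE TABLE payments (invoice_id INTEGER, paid REAL);
INSERT INTO payments VALUES (1, 100), (3, 20);
""")

# The CTE totals payments per invoice; the LEFT JOIN keeps invoices with
# no payments at all; the WHERE clause surfaces outstanding balances.
query = """
WITH paid_totals AS (
    SELECT invoice_id, SUM(paid) AS total_paid
    FROM payments GROUP BY invoice_id
)
SELECT i.id, i.amount - COALESCE(p.total_paid, 0) AS outstanding
FROM invoices i
LEFT JOIN paid_totals p ON p.invoice_id = i.id
WHERE i.amount - COALESCE(p.total_paid, 0) > 0
ORDER BY i.id
"""
rows = con.execute(query).fetchall()
print(rows)  # [(2, 50.0), (3, 50.0)]
```

Being able to explain why the `COALESCE` and the `LEFT JOIN` (rather than an inner join) are needed is exactly the debugging skill the requirement describes.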
Preferred Qualifications
- Prior experience implementing financial automation tools (e.g., SAP, Oracle, Anaplan, Blackline)
- Familiarity with API integrations and basic data mapping
- Experience in agile/scrum-based implementation environments
- Exposure to reconciliation, book closure, AR/AP, and reporting systems
- PMP, CSM, or similar certifications
A brilliant opportunity to become part of a highly motivated and expert team that has made its mark in high-end technical consulting.
Required Skills:
- Exp. - 3 to 7 years.
- Experience in Core Java and Spring Boot.
- Extensive experience in developing enterprise-scale applications and systems. Should possess good architectural knowledge and be aware of enterprise application design patterns.
- Should have the ability to analyze, design, develop, and test complex, low-latency client-facing applications.
- Good development experience with RDBMS.
- Good knowledge of multi-threading and high-performance server-side development.
- Basic working knowledge of Unix/Linux.
- Excellent problem solving and coding skills.
- Strong interpersonal, communication and analytical skills.
- Should have the ability to express their design ideas and thoughts.
About Wissen Technology:
Wissen Technology is a niche global consulting and solutions company that brings unparalleled domain expertise in Banking and Finance, Telecom and Startups. Wissen Technology is a part of Wissen Group and was established in the year 2015. Wissen has offices in the US, India, UK, Australia, Mexico, and Canada, with best-in-class infrastructure and development facilities. Wissen has successfully delivered projects worth $1 Billion for more than 25 of the Fortune 500 companies. The Wissen Group overall includes more than 4000 highly skilled professionals.
Wissen Technology provides exceptional value in mission critical projects for its clients, through thought leadership, ownership, and assured on-time deliveries that are always ‘first time right’.
Our team consists of 1200+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League Universities like Wharton, MIT, IITs, IIMs, and NITs and with rich work experience in some of the biggest companies in the world.
Wissen Technology offers an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, Quality Assurance & Test Automation.
We have been certified as a Great Place to Work® for two consecutive years (2020-2022) and voted as the Top 20 AI/ML vendor by CIO Insider.
Required Skills and Qualifications :
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Data Modeler or in a similar role at an asset manager or financial firm.
- Strong understanding of various business concepts related to buy-side financial firms. Understanding of Private Markets (Private Credit, Private Equity, Real Estate, Alternatives) is required.
- Strong understanding of database design principles and data modeling techniques (e.g., ER modeling, dimensional modeling).
- Knowledge of SQL and experience with relational databases (e.g., Oracle, SQL Server, MySQL).
- Familiarity with NoSQL databases is a plus.
- Excellent analytical and problem-solving skills.
- Strong communication skills and the ability to work collaboratively.
Preferred Qualifications:
- Experience in data warehousing and business intelligence.
- Knowledge of data governance practices.
- Certification in data modeling or related fields.
Key Responsibilities :
- Design and develop conceptual, logical, and physical data models based on business requirements.
- Collaborate with stakeholders in finance, operations, risk, legal, compliance and front offices to gather and analyze data requirements.
- Ensure data models adhere to best practices for data integrity, performance, and security.
- Create and maintain documentation for data models, including data dictionaries and metadata.
- Conduct data profiling and analysis to identify data quality issues.
- Conduct detailed meetings and discussions with business to translate broad business functionality requirements into data concepts, data models and data products.
We are looking for an experienced Java Support Engineer with 4+ years of hands-on experience in supporting and maintaining Java/Spring Boot-based applications. The ideal candidate will be responsible for production support, debugging issues, and ensuring smooth application performance.
Key Responsibilities:
- Provide L2/L3 support for Java/Spring Boot applications in production and non-production environments.
- Perform incident analysis, root cause identification, and apply quick fixes or permanent solutions.
- Handle application deployments, environment monitoring, and performance tuning.
- Collaborate with development, DevOps, and database teams to resolve technical issues.
- Write and debug SQL queries, manage data fixes, and ensure database integrity.
- Use monitoring tools like Splunk, Kibana, or ELK Stack for issue investigation.
- Prepare documentation for recurring issues and maintain knowledge base.
Technical Skills:
- Strong in Core Java, Spring Boot, RESTful APIs
- Good knowledge of SQL / PL-SQL (Oracle / MySQL / PostgreSQL)
- Familiar with Linux/Unix commands and Shell scripting
- Exposure to microservices architecture and CI/CD tools (Jenkins, Maven)
- Hands-on experience with application monitoring and log analysis tools
- Knowledge of cloud (AWS / Azure) environments is a plus
Soft Skills:
- Strong problem-solving and analytical mindset
- Good communication and teamwork skills
- Ability to work under pressure and handle on-call support if required


Required Skills:
- 4+ years of experience designing, developing, and implementing enterprise-level, n-tier, software solutions.
- Proficiency with Microsoft C# is a must.
- In-depth experience with .NET framework and .NET Core.
- Knowledge of OOP, server technologies, and SOA is a must; 3+ years of microservices experience.
- Relevant experience with database design and SQL (Postgres is preferred).
- Experience with ORM tooling.
- Experience delivering software that is correct, stable, and security compliant.
- Basic understanding of common cloud platforms (good to have).
- Financial services experience is strongly preferred.
- Thorough understanding of XML/JSON and related technologies.
- Thorough understanding of unit, integration, and performance testing for APIs.
- Entrepreneurial spirit. You are self-directed, innovative, and biased towards action. You love to build new things and thrive in fast-paced environments.
- Excellent communication and interpersonal skills, with an emphasis on strong writing and analytical problem-solving.

Wissen Technology is hiring for Data Engineer
About Wissen Technology: At Wissen Technology, we deliver niche, custom-built products that solve complex business challenges across industries worldwide. Founded in 2015, our core philosophy is built around a strong product engineering mindset, ensuring every solution is architected and delivered right the first time. Today, Wissen Technology has a global footprint with 2000+ employees across offices in the US, UK, UAE, India, and Australia.
Our commitment to excellence translates into delivering 2X impact compared to traditional service providers. We achieve this through a combination of deep domain knowledge, cutting-edge technology expertise, and a relentless focus on quality. We don’t just meet expectations; we exceed them by ensuring faster time-to-market, reduced rework, and greater alignment with client objectives.
We have a proven track record of building mission-critical systems across industries, including financial services, healthcare, retail, manufacturing, and more. Wissen stands apart through its unique delivery models: our outcome-based projects ensure predictable costs and timelines, while our agile pods give clients the flexibility to adapt to their evolving business needs. Wissen leverages its thought leadership and technology prowess to drive superior business outcomes. Our success is powered by top-tier talent, and our mission is clear: to be the partner of choice for building world-class custom products that deliver exceptional impact, the first time, every time.
Job Summary: Wissen Technology is hiring a Data Engineer with a strong background in Python, data engineering, and workflow optimization. The ideal candidate will have experience with Delta Tables and Parquet, and be proficient in Pandas and PySpark.
Experience: 7+ years
Location: Pune, Mumbai, Bangalore
Mode of Work: Hybrid
Key Responsibilities:
- Develop and maintain data pipelines using Python (Pandas, PySpark).
- Optimize data workflows and ensure efficient data processing.
- Work with Delta Tables and Parquet for data storage and management.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions.
- Ensure data quality and integrity throughout the data lifecycle.
- Implement best practices for data engineering and workflow optimization.
Qualifications and Required Skills:
- Proficiency in Python, specifically with Pandas and PySpark.
- Strong experience in data engineering and workflow optimization.
- Knowledge of Delta Tables and Parquet.
- Excellent problem-solving skills and attention to detail.
- Ability to work collaboratively in a team environment.
- Strong communication skills.
Good to Have Skills:
- Experience with Databricks.
- Knowledge of Apache Spark, DBT, and Airflow.
- Advanced Pandas optimizations.
- Familiarity with PyTest/DBT testing frameworks.
Wissen Sites:
- Website: http://www.wissen.com
- LinkedIn: https://www.linkedin.com/company/wissen-technology
- Wissen Leadership: https://www.wissen.com/company/leadership-team/
- Wissen Live: https://www.linkedin.com/company/wissen-technology/posts/feedView=All
- Wissen Thought Leadership: https://www.wissen.com/articles/
Wissen | Driving Digital Transformation
A technology consultancy that drives digital innovation by connecting strategy and execution, helping global clients to strengthen their core technology.
Now Hiring: Tableau Developer (Banking Domain) 🚀
We’re looking for a Tableau professional with 6+ years of experience to design and optimize dashboards for Banking & Financial Services.
🔹 Design & optimize interactive Tableau dashboards for large banking datasets
🔹 Translate KPIs into scalable reporting solutions
🔹 Ensure compliance with regulations like KYC, AML, Basel III, PCI-DSS
🔹 Collaborate with business analysts, data engineers, and banking experts
🔹 Bring deep knowledge of SQL, data modeling, and performance optimization
🌍 Location: Remote
📊 Domain Expertise: Banking / Financial Services
✨ Preferred experience with cloud data platforms (AWS, Azure, GCP) & certifications in Tableau are a big plus!
Bring your data visualization skills to transform banking intelligence & compliance reporting.
⚠️ Important Note (Please Read Before Applying):
- Only candidates with 5–8 years of relevant experience should apply.
- Freshers or candidates with less than 5 years of experience – please do not apply.
- Only immediate joiners or candidates currently serving notice will be considered.
Job Title: Java Developer (5–8 Years)
Location: Bangalore (Hybrid Mode)
Experience: 5 to 8 years
Joining: Immediate / Notice Serving Only
About the Role:
We are looking for passionate and highly skilled Java Developers to join our dynamic team in Bangalore. The ideal candidate will have strong expertise in Java, Spring Boot, Collections, Multithreading, and Data Structures & Algorithms (DSA), with proven problem-solving abilities.
Key Responsibilities:
- Design, develop, and maintain high-performance, scalable, and secure applications.
- Work with Spring Boot and related frameworks to build microservices-based solutions.
- Optimize code using Collections & Multithreading concepts for performance and reliability.
- Apply strong DSA and problem-solving skills to deliver efficient solutions.
- Collaborate with cross-functional teams to ensure timely delivery of high-quality software.
- Troubleshoot, debug, and resolve production issues efficiently.
Required Skills & Experience:
- 5–8 years of hands-on experience in Core Java and Spring Boot.
- Strong expertise in Collections, Multithreading, and Concurrency.
- Solid understanding of DSA, Algorithms, and System Design fundamentals.
- Experience in developing REST APIs and Microservices.
- Proficiency in writing clean, maintainable, and efficient code.
- Strong analytical and problem-solving skills.
Why Join Us?
- Opportunity to work on cutting-edge projects with modern architectures.
- Hybrid work setup in Bangalore.
- Fast-paced and growth-driven environment.


Company Description:
NonStop io Technologies, founded in August 2015, is a Bespoke Engineering Studio specializing in Product Development. With over 80 satisfied clients worldwide, we serve startups and enterprises across prominent technology hubs, including San Francisco, New York, Houston, Seattle, London, Pune, and Tokyo. Our experienced team brings over 10 years of expertise in building web and mobile products across multiple industries. Our work is grounded in empathy, creativity, collaboration, and clean code, striving to build products that matter and foster an environment of accountability and collaboration.
Brief Description:
NonStop io is seeking a proficient .NET Developer to join our growing team. You will be responsible for developing, enhancing, and maintaining scalable applications using .NET technologies. This role involves working on a healthcare-focused product and requires strong problem-solving skills, attention to detail, and a passion for software development.
Responsibilities:
- Design, develop, and maintain applications using .NET Core/.NET Framework, C#, and related technologies
- Write clean, scalable, and efficient code while following best practices
- Develop and optimize APIs and microservices
- Work with SQL Server and other databases to ensure high performance and reliability
- Collaborate with cross-functional teams, including UI/UX designers, QA, and DevOps
- Participate in code reviews and provide constructive feedback
- Troubleshoot, debug, and enhance existing applications
- Ensure compliance with security and performance standards, especially for healthcare-related applications
Qualifications & Skills:
- Strong experience in .NET Core/.NET Framework and C#
- Proficiency in building RESTful APIs and microservices architecture
- Experience with Entity Framework, LINQ, and SQL Server
- Familiarity with front-end technologies like React, Angular, or Blazor is a plus
- Knowledge of cloud services (Azure/AWS) is a plus
- Experience with version control (Git) and CI/CD pipelines
- Strong understanding of object-oriented programming (OOP) and design patterns
- Prior experience in healthcare tech or working with HIPAA-compliant systems is a plus
Why Join Us?
- Opportunity to work on a cutting-edge healthcare product
- A collaborative and learning-driven environment
- Exposure to AI and software engineering innovations
- Excellent work ethics and culture
If you're passionate about technology and want to work on impactful projects, we'd love to hear from you!

Data Engineer
Experience: 4–6 years
Key Responsibilities
- Design, build, and maintain scalable data pipelines and workflows.
- Manage and optimize cloud-native data platforms on Azure with Databricks and Apache Spark (1–2 years).
- Implement CI/CD workflows and monitor data pipelines for performance, reliability, and accuracy.
- Work with relational databases (Sybase, DB2, Snowflake, PostgreSQL, SQL Server) and ensure efficient SQL query performance.
- Apply data warehousing concepts including dimensional modelling, star schema, data vault modelling, Kimball and Inmon methodologies, and data lake design.
- Develop and maintain ETL/ELT pipelines using open-source frameworks such as Apache Spark and Apache Airflow.
- Integrate and process data streams from message queues and streaming platforms (Kafka, RabbitMQ).
- Collaborate with cross-functional teams in a geographically distributed setup.
- Leverage Jupyter notebooks for data exploration, analysis, and visualization.
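The dimensional-modelling responsibilities above (star schema, fact and dimension tables) can be sketched with a toy example. All table and column names below are invented for illustration, and SQLite stands in for a real warehouse such as Snowflake:

```python
import sqlite3

# Toy star schema: one fact table keyed to a date dimension (all names invented).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER);
    CREATE TABLE fact_sales (date_key INTEGER, amount REAL);
    INSERT INTO dim_date VALUES (20240101, 2024), (20250101, 2025);
    INSERT INTO fact_sales VALUES (20240101, 100.0), (20250101, 40.0), (20250101, 60.0);
""")

# The classic star-schema query shape: join fact to dimension, group by attribute.
sales_by_year = dict(conn.execute("""
    SELECT d.year, SUM(f.amount)
    FROM fact_sales AS f
    JOIN dim_date AS d ON f.date_key = d.date_key
    GROUP BY d.year
"""))
```

The same fact-join-dimension shape scales to Kimball-style schemas with many dimensions; only the number of joins grows.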
Required Skills
- 4+ years of experience in data engineering or a related field.
- Strong programming skills in Python with experience in Pandas, NumPy, Flask.
- Hands-on experience with pipeline monitoring and CI/CD workflows.
- Proficiency in SQL and relational databases.
- Familiarity with Git for version control.
- Strong communication and collaboration skills with ability to work independently.
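As a rough illustration of the pipeline skills listed above, here is a minimal extract-transform-load sketch in pure Python. The `raw_orders` table and its data-quality rule are hypothetical; a production pipeline would use Spark, Airflow, or similar rather than inline SQLite:

```python
import sqlite3

# Hypothetical raw feed: order rows land as text and need cleaning (names invented).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount TEXT, region TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, "10.50", "south"), (2, "bad-value", "north"), (3, "4.25", "south")],
)

def transform(rows):
    """Toy data-quality rule: keep only rows whose amount parses as a number."""
    clean = []
    for oid, amount, region in rows:
        try:
            clean.append((oid, float(amount), region))
        except ValueError:
            continue  # drop malformed records (a real pipeline would quarantine them)
    return clean

# Extract -> transform -> load into a typed table, then aggregate.
rows = conn.execute("SELECT id, amount, region FROM raw_orders").fetchall()
clean_rows = transform(rows)
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL, region TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", clean_rows)
total_by_region = dict(
    conn.execute("SELECT region, SUM(amount) FROM orders GROUP BY region")
)
```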


About the company
We are the most trusted provider of data collection and management, marketing program management, and analytical solutions for our Crop and Animal Health industry clients. With data services at the core—surrounded by an extensible array of streamlined software solutions—our unified platform represents over three decades of innovation and expertise in the agriculture, crop protection, specialty chemical and animal health industries.
Backed by an entrepreneurial, creative, and energetic workforce, teammates at AGDATA are pushing the boundaries of technology to enhance our relationships with our clients. We are a growing team, focused on adding creative, knowledgeable individuals who are ready to jump right in and make an immediate impact.
- 30+ years of experience in the Crop and Animal Health industry
- More than 20 billion USD sales processed annually
- Over 2,15,000 payments issued via marketing programs yearly
What’s the role?
If you are looking for an opportunity to solve deep technical problems, build innovative solutions, and work with top-notch software developers in the Pune area, AGDATA might have the role for you.
You must be able to look at the big picture from both business and technology perspective, possess strong analytical, design, and problem-solving skills, and enjoy working with data and algorithms.
You are not afraid of ambiguity, dealing with nebulous requirements, and get excited about difficult challenges.
Our ideal candidate will have...
- 7+ years of software development experience with emphasis on web technologies, cloud computing (Azure preferred), and SaaS
- Deep hands-on experience in Microsoft technologies stack such as .Net 6+, C# (strong knowledge of collections, async await patterns), Web API, windows services, and relational database (MSSQL)
- Proven experience on front end technologies like Angular
- Expertise in RESTful API, SOA, Microservice, AMQP and distributed architecture and design
- Ability to understand complex data relationships
- Experience in Unit Testing
- Experience in Azure cloud services/ Azure DevOps
- Demonstrated skill in aligning application decisions to an overarching solution and systems architecture
- Structured thinker, effective communicator, with excellent programming and analytic skills
In this role, you will ...
- Take your problem-solving skills and expertise in system design to the next level by delivering innovative solutions
- Actively contribute to the development process by writing high-quality code
- Utilize your full stack development skills and work with diverse technologies to deliver outstanding results
- Adapt quickly to new technologies and leverage your past experiences to stay ahead
- Exhibit a passion for building software and delivering high-quality products, prioritizing user experience
- Engage in all phases of the software development life cycle, including design, implementation, and unit testing
- Think from the perspective of our customers, optimizing their experience with our software
How AGDATA will support you:
Supporting your health & well-being:
- Comprehensive medical coverage – up to INR 7.5 lakh for employee and dependents, including parents
- OPD benefit – coverage of up to INR 15 thousand covering expenses across specialties
- Paternity leave up to 14 working days with the option to split leave
Emphasizing work life balance: Flexible hybrid work policy
Experiencing a work culture that promotes from within: In 2023, 14% of our associates were promoted internally
Being comfortable in the office: Coming into our brand-new office space? Free snacks and top class facilities will be available
AccioJob is conducting a Walk-In Hiring Drive with HummingBird Technologies for the position of Java Backend Developer.
To apply, register and select your slot here: https://go.acciojob.com/wNrG3R
Required Skills: DSA, OOPs, SQL, Rest API
Eligibility:
- Degree: BTech./BE, MTech./ME, BCA, MCA, BSc., MSc
- Branch: Computer Science/CSE/Other CS related branch, Electrical/Other electrical related branches, IT
- Graduation Year: 2026
Work Details:
- Work Location: Pune (Onsite)
- CTC: 9 LPA
Evaluation Process:
Round 1: Offline Assessment at AccioJob Pune Centre
Further Rounds (for shortlisted candidates only):
- Profile Evaluation, Technical Interview 1
- Technical Interview 2
- HR Discussion
Important Note: Bring your laptop & earphones for the test.
Register here: https://go.acciojob.com/wNrG3R
AccioJob is conducting a Walk-In Hiring Drive with HummingBird Technologies for the position of Java Backend Developer.
To apply, register and select your slot here: https://go.acciojob.com/gqHtdK
Required Skills: DSA, OOPs, SQL, Rest API
Eligibility:
- Degree: BTech./BE, MTech./ME, BCA, MCA, BSc., MSc
- Branch: Computer Science/CSE/Other CS related branch, Electrical/Other electrical related branches, IT
- Graduation Year: 2024, 2025
Work Details:
- Work Location: Pune (Onsite)
- CTC: 9 LPA
Evaluation Process:
Round 1: Offline Assessment at AccioJob Pune Centre
Further Rounds (for shortlisted candidates only):
- Profile Evaluation, Technical Interview 1
- Technical Interview 2
- HR Discussion
Important Note: Bring your laptop & earphones for the test.
Register here: https://go.acciojob.com/gqHtdK
- 8+ years of Data Engineering experience
- Strong SQL and Redshift experience
- CI/CD and orchestration experience using Bitbucket, Jenkins and Control-M
- Reporting experience preferably Tableau
- Location – Pune, Hyderabad, Bengaluru
Job Description – Java Developer
Role: Java Developer
Location: Pune / Mumbai
Experience: 5 to 10 years
Required Skills:
We are looking for an experienced Java Developer with strong expertise in Core Java, Spring, Spring Boot, and Hibernate. The candidate should have solid experience in designing, developing, and deploying enterprise-grade applications, with strong understanding of OOPs concepts, data structures, and algorithms. Hands-on experience with RESTful APIs, Microservices, and Database technologies (MySQL/Oracle/SQL Server) is essential.
The ideal candidate should be well-versed in version control systems (Git), build tools (Maven/Gradle), and CI/CD pipelines (Jenkins). Exposure to cloud platforms (AWS/Azure/GCP) and containerization (Docker/Kubernetes) will be a strong plus.
Key Responsibilities:
- Design, develop, and maintain scalable and high-performance applications.
- Write clean, reusable, and efficient code following best practices.
- Collaborate with cross-functional teams to deliver quality solutions.
- Perform code reviews, debugging, and performance tuning.
- Ensure application security, reliability, and scalability.
Good To Have Skills:
- Knowledge of front-end technologies (JavaScript, Angular/React).
- Familiarity with Agile/Scrum methodologies.
- Strong problem-solving and analytical skills.


Job Title: Data Engineering Support Engineer / Manager
Experience: 8+ years
Location: Mumbai
Knowledge, Skills and Abilities
- Python, SQL
- Familiarity with data engineering
- Experience with AWS data and analytics services or similar cloud vendor services
- Strong problem solving and communication skills
- Ability to organise and prioritise work effectively
Key Responsibilities
- Incident and user management for data and analytics platform
- Development and maintenance of a Data Quality framework (including anomaly detection)
- Implementation of Python & SQL hotfixes and working with data engineers on more complex issues
- Diagnostic tools implementation and automation of operational processes
Key Relationships
- Work closely with data scientists, data engineers, and platform engineers in a highly commercial environment
- Support research analysts and traders with issue resolution

Location: Krishnagiri, Tamil Nadu
Experience: Minimum 2 years
Job Type: Full-time
Preferred Candidate: Female
About the Role:
We are seeking a dedicated and enthusiastic Computer Teacher to join our academic team. The ideal candidate will have at least 2 years of teaching experience, strong communication skills, and a passion for imparting computer knowledge to students at the school/college level.
Key Responsibilities:
- Deliver computer science curriculum to students as per academic guidelines.
- Teach foundational topics such as MS Office, Internet, HTML, and basic programming (e.g., Scratch, Python, C – as applicable).
- Plan and execute interactive lessons using digital teaching tools.
- Conduct practical sessions in the computer lab.
- Assess students’ progress through assignments, tests, and projects.
- Maintain attendance, grades, and student performance records.
- Encourage students to participate in tech-based activities and competitions.
- Collaborate with school/college staff for curriculum planning and development.
- Provide basic technical support for classroom technology when needed.
Required Qualifications:
- Bachelor’s degree in Computer Science, BCA, or any relevant discipline.
- B.Ed (preferred for school teaching roles).
- Minimum 2 years of teaching experience in a school or academic institution.
- Good command of English and Tamil (or local language as required).
- Strong classroom management and communication skills.
Preferred Qualities:
- Female candidates are preferred for this role.
- Ability to adapt teaching methods based on student needs.
- Familiarity with smart classroom tools and e-learning platforms.
- Passion for education and mentoring young minds.
Working Hours:
- Monday, Tuesday, Thursday to Saturday
- Timings: 9:00 AM to 4:00 PM
Salary Range:
₹15,000 – ₹20,000/month (based on experience and qualification)



Pay: ₹70,000.00 - ₹90,000.00 per month
Job description:
Name of the College: KGiSL Institute of Technology
College Profile: The main objective of KGiSL Institute of Technology is to provide industry-embedded education and to mold students for leadership in industry, government, and educational institutions; to advance the knowledge base of the engineering professions; and to influence the future directions of engineering education and practice. The ability to connect to future challenges and deliver industry-ready human resources is a credibility that KGiSL Educational Institutions have progressively excelled at. Industry-readiness of its students is what will eventually elevate an institution to star status and its competitiveness in the job market. The choice of such an institution will depend on its proximity to industry, the relevance of its learning programme to real-time industry, and the active connection that a student will have with industry professionals.
Job Title: Assistant Professor / Associate Professor
Departments:
● CSE
Qualification:
● ME/M.Tech/Ph.D (Ph.D. required for Associate Professor)
Experience:
● Freshers can apply
● Experience: 8–10 years
Key Responsibilities:
1. Teaching & Learning:
Deliver high-quality lectures and laboratory sessions in core and advanced areas of Computer Science & Engineering.
Prepare lesson plans, teaching materials, and assessment tools as per the approved curriculum.
Adopt innovative teaching methodologies, including ICT-enabled learning and outcome-based education (OBE).
2. Research & Publications:
Conduct independent and collaborative research in areas of specialization.
Publish research papers in peer-reviewed journals and present in reputed conferences.
Eligibility & Qualifications (As per AICTE/UGC Norms):
Educational Qualification: Ph.D. in Computer Science & Engineering or relevant discipline.
Experience: Minimum of 8 years teaching/research/industry experience, with at least 3 years at the level of Assistant Professor.
Research: Minimum of 7 publications in refereed journals as per UGC-CARE list and at least one Ph.D. degree awarded or ongoing under supervision.
Other Requirements:
Good academic record throughout.
Proven ability to attract research funding.
Strong communication and interpersonal skills.
Work Location: KGiSL Campus
Employment Type: Full-time / Permanent
Joining time: immediately
Job Type: Full-time
Benefits:
- Health insurance
- Life insurance
- Provident Fund
Work Location: In person

Responsibilities
• Design, develop, and maintain backend systems and RESTful APIs using Python (Django, FastAPI, or Flask)
• Build real-time communication features using WebSockets, SSE, and async IO
• Implement event-driven architectures using messaging systems like Kafka, RabbitMQ, Redis Streams, or NATS
• Develop and maintain microservices that interact over messaging and streaming protocols
• Ensure high scalability and availability of backend services
• Collaborate with frontend developers, DevOps engineers, and product managers to deliver end-to-end solutions
• Write clean, maintainable code with unit/integration tests
• Lead technical discussions, review code, and mentor junior engineers
Requirements
• 8+ years of backend development experience, with at least 8 years in Python
• Strong experience with asynchronous programming in Python (e.g., asyncio, aiohttp, FastAPI)
• Production experience with WebSockets and Server-Sent Events
• Hands-on experience with at least one messaging system: Kafka, RabbitMQ, Redis Pub/Sub, or similar
• Proficient in RESTful API design and microservices architecture
• Solid experience with relational and NoSQL databases
• Familiarity with Docker and container-based deployment
• Strong understanding of API security, authentication, and performance optimization
Nice to Have
• Experience with GraphQL or gRPC
• Familiarity with stream processing frameworks (e.g., Apache Flink, Spark Streaming)
• Cloud experience (AWS, GCP, Azure), particularly with managed messaging or pub/sub services
• Knowledge of CI/CD and infrastructure as code
• Exposure to AI engineering workflows and tools
• Interest or experience in building Agentic AI systems or integrating backends with AI agents
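The event-driven requirements above can be illustrated with a minimal in-process sketch: an `asyncio.Queue` stands in for a broker topic such as Kafka or RabbitMQ, and the event names are hypothetical:

```python
import asyncio

async def producer(queue, events):
    """Publish each event, then a sentinel meaning 'no more events'."""
    for event in events:
        await queue.put(event)
    await queue.put(None)

async def consumer(queue, handled):
    """Consume until the sentinel; uppercasing stands in for real handling."""
    while True:
        event = await queue.get()
        if event is None:
            break
        handled.append(event.upper())

async def main():
    queue = asyncio.Queue()  # in-process stand-in for a broker topic
    handled = []
    await asyncio.gather(
        producer(queue, ["order_created", "order_paid"]),
        consumer(queue, handled),
    )
    return handled

handled_events = asyncio.run(main())
```

The same producer/consumer decoupling is what the listed brokers provide across processes and machines, with durability and delivery guarantees layered on top.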
Job Description
3-5 years of hands-on experience in manual testing involving functional, non-functional, regression, and integration testing in a structured environment.
Candidate should have exceptional communication skills.
Should have a minimum of 1 year of work experience in data comparison testing.
Experience in testing web-based applications.
Able to define the scope of testing.
Experience in testing large-scale solutions integrating multiple source and target systems.
Experience in API testing.
Experience in Database verification using SQL queries.
Experience working in an Agile team.
Should be able to attend Agile ceremonies in UK hours.
Having a good understanding of Data Migration projects will be a plus.
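The database-verification and data-comparison points above can be sketched with a small example. The `source`/`target` tables are hypothetical, and SQLite stands in for the real databases under test:

```python
import sqlite3

# Hypothetical migrated data: 'source' is the system of record, 'target' the copy.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE target (id INTEGER PRIMARY KEY, name TEXT);
    INSERT INTO source VALUES (1, 'alice'), (2, 'bob'), (3, 'carol');
    INSERT INTO target VALUES (1, 'alice'), (2, 'bobby');
""")

def compare_tables(conn, source, target):
    """Rows present in source but missing or different in target."""
    query = f"SELECT id, name FROM {source} EXCEPT SELECT id, name FROM {target}"
    return conn.execute(query).fetchall()

mismatches = compare_tables(conn, "source", "target")
```

`EXCEPT` is a common shape for migration checks because it surfaces both missing rows and value drift in one query; running it in both directions also catches rows that exist only in the target.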

What We’re Looking For:
- Strong experience in Python (5+ years).
- Hands-on experience with any database (SQL or NoSQL).
- Experience with frameworks like Flask, FastAPI, or Django.
- Knowledge of ORMs, API development, and unit testing.
- Familiarity with Git and Agile methodologies.
- Familiarity with the Kafka tool (Added Advantage)

What We’re Looking For:
- Strong experience in Python (4+ years).
- Hands-on experience with any database (SQL or NoSQL).
- Experience with frameworks like Flask, FastAPI, or Django.
- Knowledge of ORMs, API development, and unit testing.
- Familiarity with Git and Agile methodologies.
- Familiarity with the Kafka tool (Added Advantage)
Job Summary:
We are looking for a skilled and motivated Backend Engineer with 2 to 4 years of professional experience to join our dynamic engineering team. You will play a key role in designing, building, and maintaining the backend systems that power our products. You’ll work closely with cross-functional teams to deliver scalable, secure, and high-performance solutions that align with business and user needs.
This role is ideal for engineers ready to take ownership of systems, contribute to architectural decisions, and solve complex backend challenges.
Website: https://www.thealteroffice.com/about
Key Responsibilities:
- Design, build, and maintain robust backend systems and APIs that are scalable and maintainable.
- Collaborate with product, frontend, and DevOps teams to deliver seamless, end-to-end solutions.
- Model and manage data using SQL (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Redis), incorporating caching where needed.
- Implement and manage authentication, authorization, and data security practices.
- Write clean, well-documented, and well-tested code following best practices.
- Work with cloud platforms (AWS, GCP, or Azure) to deploy, monitor, and scale services effectively.
- Use tools like Docker (and optionally Kubernetes) for containerization and orchestration of backend services.
- Maintain and improve CI/CD pipelines for faster and safer deployments.
- Monitor and debug production issues, using observability tools (e.g., Prometheus, Grafana, ELK) for root cause analysis.
- Participate in code reviews, contribute to improving development standards, and provide support to less experienced engineers.
- Work with event-driven or microservices-based architecture, and optionally use technologies like GraphQL, WebSockets, or message brokers such as Kafka or RabbitMQ when suitable for the solution.
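The caching point above can be illustrated with a minimal time-based cache. This is a toy sketch only (no locking or background eviction), not a substitute for Redis, and all keys and values are invented:

```python
import time

class TTLCache:
    """Toy time-based cache: lazy expiry on read, no locking or eviction thread."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._store = {}

    def set(self, key, value, now=None):
        now = time.monotonic() if now is None else now
        self._store[key] = (value, now + self.ttl)

    def get(self, key, now=None):
        now = time.monotonic() if now is None else now
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if now >= expires_at:
            del self._store[key]  # expired: drop it and report a miss
            return None
        return value

cache = TTLCache(ttl_seconds=60)
cache.set("user:42", {"name": "demo"}, now=0.0)  # hypothetical key and value
```

Injecting `now` keeps expiry testable; a production cache would also bound memory and handle concurrent access.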
Requirements:
- 2 to 4 years of professional experience as a Backend Engineer or similar role.
- Proficiency in at least one backend programming language (e.g., Python, Java, Go, Ruby, etc.).
- Strong understanding of RESTful API design, asynchronous programming, and scalable architecture patterns.
- Solid experience with both relational and NoSQL databases, including designing and optimizing data models.
- Familiarity with Docker, Git, and modern CI/CD workflows.
- Hands-on experience with cloud infrastructure and deployment processes (AWS, GCP, or Azure).
- Exposure to monitoring, logging, and performance profiling practices in production environments.
- A good understanding of security best practices in backend systems.
- Strong problem-solving, debugging, and communication skills.
- Comfortable working in a fast-paced, agile environment with evolving priorities.


Role Overview:
We are seeking a highly skilled and experienced Lead Web App Developer - Backend to join our dynamic team in Bengaluru. The ideal candidate will have a strong background in backend development, microservices architecture, and cloud technologies, with a proven ability to deliver robust, scalable solutions. This role involves designing, implementing, and maintaining complex distributed systems, primarily in the Mortgage Finance domain.
Key Responsibilities:
- Cloud-Based Web Applications Development:
- Lead backend development efforts for cloud-based web applications.
- Work on diverse projects within the Mortgage Finance domain.
- Microservices Design & Development:
- Design and implement microservices-based architectures.
- Ensure scalability, availability, and reliability of distributed systems.
- Programming & API Development:
- Write efficient, reusable, and maintainable code in Python, Node.js, and Java.
- Develop and optimize RESTful APIs.
- Infrastructure Management:
- Leverage AWS platform infrastructure to build secure and scalable solutions.
- Utilize tools like Docker for containerization and deployment.
- Database Management:
- Work with RDBMS (MySQL) and NoSQL databases to design efficient schemas and optimize queries.
- Team Collaboration:
- Collaborate with cross-functional teams to ensure seamless integration and delivery of projects.
- Mentor junior developers and contribute to the overall skill development of the team.
Core Requirements:
- Experience: Minimum 10+ years in backend development, with at least 3+ years of experience in designing and delivering large-scale products on microservices architecture.
- Technical Skills:
- Programming Languages: Python, Node.js, Java.
- Frameworks & Tools: AWS (Lambda, RDS, etc.), Docker.
- Database Expertise: Proficiency in RDBMS (MySQL) and NoSQL databases.
- API Development: Hands-on experience in developing REST APIs.
- System Design: Strong understanding of distributed systems, scalability, and availability.
Additional Skills (Preferred):
- Experience with modern frontend frameworks like React.js or AngularJS.
- Strong design and architecture capabilities.
What We Offer:
- Opportunity to work on cutting-edge technologies in a collaborative environment.
- Competitive salary and benefits package.
- Flexible hybrid working model.
- Chance to contribute to impactful projects in the Mortgage Finance domain.

Role Overview
We are looking for a highly skilled Product Engineer to join our dynamic team. This is an exciting opportunity to work on innovative FinTech solutions and contribute to the future of global payments. If you're passionate about backend development, API design, and scalable architecture, we'd love to hear from you!
Key Responsibilities
- Design, develop, and maintain scalable, high-performance backend systems.
- Write clean, maintainable, and efficient code while following best practices.
- Build and optimize RESTful APIs and database queries.
- Collaborate with cross-functional teams to deliver 0 to 1 products.
- Ensure smooth CI/CD pipeline implementation and deployment automation.
- Contribute to open-source projects and stay updated with industry trends.
- Maintain a strong focus on security, performance, and reliability.
- Work with payment protocols and financial regulations to ensure compliance.
Required Skills & Qualifications
- ✅ 3+ years of professional software development experience.
- ✅ Proficiency in any backend language (with preference for Ruby on Rails).
- ✅ Strong foundation in architecture, design, and database optimization.
- ✅ Experience in building APIs and working with SQL/NoSQL databases.
- ✅ Familiarity with CI/CD practices and automation tools.
- ✅ Excellent problem-solving and analytical skills.
- ✅ Strong track record of open-source contributions (minimum 50 stars on GitHub).
- ✅ Passion for FinTech and payment systems.
- ✅ Strong communication skills and ability to work collaboratively in a team.
Nice to Have
- Prior experience in financial services or payment systems.
- Exposure to microservices architecture and cloud platforms.
- Knowledge of containerization tools like Docker & Kubernetes.

Role overview
- Overall 5 to 7 years of experience; Node.js experience is a must.
- At least 3+ years of experience, or a couple of large-scale products delivered on microservices.
- Strong design skills in microservices and AWS platform infrastructure.
- Excellent programming skills in Python, Node.js, and Java.
- Hands-on development of REST APIs.
- Good understanding of the nuances of distributed systems, scalability, and availability.
What would you do here
- To Work as a Backend Developer in developing Cloud Web Applications
- To be part of the team working on various types of web applications related to Mortgage Finance.
- Experience solving the real-world problem of implementing, designing, and helping develop a new enterprise-class product from the ground up.
- Expertise in AWS cloud infrastructure and microservices architecture around the AWS service stack (Lambda, SQS, SNS, MySQL databases) along with Docker and containerized solutions/applications.
- Experience with relational and NoSQL databases and scalable design.
- Experience in solving challenging problems by developing elegant, maintainable code.
- Delivered rapid iterations of software based on user feedback and metrics.
- Help the team make key decisions on our product and technology direction.
- You actively contribute to the adoption of frameworks, standards, and new technologies.

AccioJob is conducting a Walk-In Hiring Drive with Infrrd for the position of Java Full Stack Developer.
To apply, register and select your slot here: https://go.acciojob.com/3UTekG
Required Skills: DSA, OOPS, SQL, Java, Python
Eligibility:
- Degree: BTech./BE, MTech./ME
- Branch: Computer Science/CSE/Other CS related branch, IT
- Graduation Year: 2026
Work Details:
- Work Location: Bangalore (Onsite)
- Stipend Range: 30k
- Stipend Duration: 12 Months
- CTC: 6 LPA to 9 LPA
Evaluation Process:
Round 1: Offline Assessment at AccioJob Bangalore Centre
Further Rounds (for shortlisted candidates only):
- Profile Evaluation
- Technical Interview 1
- Technical Interview 2
Important Note: Bring your laptop & earphones for the test.
Register here: https://go.acciojob.com/3UTekG

Roles and responsibilities-
- Act as tech lead in one of the feature teams; the candidate needs to work alongside the team lead in handling the team without much guidance
- Good communication and leadership skills
- Nurture and build next level talent within the team
- Work in collaboration with other vendors and client development team(s)
- Flexible to learn new tech areas
- Lead complete lifecycle of feature - from feature inception to solution, story grooming, delivery, and support features in production
- Ensure and build the controls and processes for continuous delivery of applications, considering all stages of the process and its automations
- Interact with teammates from across the business and comfortable explaining technical concepts to nontechnical audiences
- Create robust, scalable, flexible, and relevant solutions that help transform product and businesses
Must haves:
- Spark
- Scala
- Postgres (or any SQL DB)
- Elasticsearch (or any NoSQL DB)
- Azure (if not, any other cloud experience)
- Big data processing
Good to have:
- Golang
- Databricks
- Kubernetes

We’re hiring a Full Stack Developer (5+ years, Pune location) to join our growing team!
You’ll be working with React.js, Node.js, JavaScript, APIs, and cloud deployments to build scalable and high-performing web applications.
Responsibilities include developing responsive apps, building RESTful APIs, working with SQL/NoSQL databases, and deploying apps on AWS/Docker.
Experience with CI/CD, Git, secure coding practices (OAuth/JWT), and Agile collaboration is a must.
If you’re passionate about full stack development and want to work on impactful projects, we’d love to connect!
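As a rough sketch of the JWT side of the secure-coding requirement above, here is a minimal HS256 sign/verify pair using only the standard library. The secret and claims are hypothetical, and in practice a vetted library such as PyJWT (or `jsonwebtoken` in Node.js) should be used instead of hand-rolled crypto:

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    """Unpadded URL-safe base64, as used in JWTs."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: bytes) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    signature = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{body}.{signature}"

def verify_jwt(token: str, secret: bytes) -> bool:
    """Check the HMAC only; a real verifier would also validate exp/aud claims."""
    header, body, signature = token.split(".")
    signing_input = f"{header}.{body}".encode()
    expected = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return hmac.compare_digest(signature, expected)

token = sign_jwt({"sub": "user-1"}, b"demo-secret")  # hypothetical claim and secret
```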
Position: Tableau Developer
Experience: 5-7 years
Location: Bangalore
Key Responsibilities:
· Design, develop, and maintain interactive dashboards and reports using Tableau, ensuring high-quality visualizations that meet business requirements.
· Write and optimize complex SQL queries to extract, manipulate, and analyse data from various sources, ensuring data integrity and accuracy.
· Stay updated on technologies and trends related to data visualization and analytics, including advanced analytics, big data, and data science. Familiarity with tools such as R, Python, and SAS is a plus.
· Utilize Snowflake for data warehousing solutions, including data modelling, ETL processes, and performance tuning to support Tableau reporting.
· Work effectively in interdisciplinary global teams, influencing stakeholders within a matrix organization to ensure alignment on reporting solutions.
· Incorporate Tableau best practices in reporting solutions and guide team members in their use to enhance overall reporting quality.
· Utilize excellent analytical and problem-solving skills to address data-related challenges and provide actionable insights.
· Communicate effectively with both technical and non-technical stakeholders to understand their reporting needs and deliver tailored solutions.
· Additional Skills: Experience with other visualization tools (e.g., Spotfire, Power BI) and programming languages (e.g., R, Python, JavaScript) is advantageous.
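The query-optimization bullet above can be illustrated with a small example showing how an index changes a query plan. The `sales` table is hypothetical, and SQLite stands in for Snowflake or another warehouse:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales (region, amount) VALUES (?, ?)",
    [("south", 10.0), ("north", 20.0), ("south", 5.0)],
)

def plan(conn, sql):
    """Return SQLite's query-plan description for a statement."""
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

sql = "SELECT SUM(amount) FROM sales WHERE region = 'south'"
before = plan(conn, sql)  # full table scan: every row is examined
conn.execute("CREATE INDEX idx_sales_region ON sales (region)")
after = plan(conn, sql)   # the planner now searches the index instead
total = conn.execute(sql).fetchone()[0]
```

Reading the plan before and after an index (or clustering-key) change is the same workflow used when tuning warehouse queries behind Tableau dashboards, just with different tooling.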
Qualifications:
· Bachelor’s degree in informatics, Information Systems, Data Science, or a related field.
· 5+ years of relevant professional experience in data analytics, performance management, or related fields.
· Strong understanding of clinical development and/or biopharma industry practices is preferred.
· Proven experience in computerized systems validation and testing methodologies.
About the company
Sigmoid is a leading data solutions company that partners with Fortune 500 enterprises to drive digital transformation through AI, big data, and cloud technologies. With a focus on scalability, performance, and innovation, Sigmoid delivers cutting-edge solutions to solve complex business challenges.
About the role
You will be responsible for building a highly scalable, extensible, and robust application. This position reports to the Engineering Manager.
Responsibilities:
- Align Sigmoid with key Client initiatives
- Interface daily with customers across leading Fortune 500 companies to understand strategic requirements
- Ability to understand business requirements and tie them to technology solutions
- Open to work from client location as per the demand of the project / customer
- Facilitate in Technical Aspects
- Develop and evolve highly scalable and fault-tolerant distributed components using Java technologies
- Excellent experience in Application development and support, integration development and quality assurance
- Provide technical leadership and manage it on a day-to-day basis
- Stay up-to-date on the latest technology to ensure the greatest ROI for customer & Sigmoid
- Hands on coder with good understanding on enterprise level code
- Design and implement APIs, abstractions and integration patterns to solve challenging distributed computing problems
- Experience in defining technical requirements, data extraction, data transformation, automating jobs, productionizing jobs, and exploring new big data technologies within a Parallel Processing environment
Culture:
- Must be a strategic thinker with the ability to think unconventional / out-of-box
- Analytical and solution driven orientation
- Raw intellect, talent and energy are critical
- Entrepreneurial and Agile: understands the demands of a private, high growth company
- Ability to be both a leader and hands-on "doer"
Qualifications:
- A proven track record of relevant work experience and a degree in Computer Science or a related technical discipline is required
- Experience in development of Enterprise scale applications and capable in developing framework, design patterns etc. Should be able to understand and tackle technical challenges and propose comprehensive solutions
- Experience with functional and object-oriented programming, Java or Python is a must
- Experience in Spring Boot, APIs, SQL
- Good to have: Git, Airflow, Node.js, Python, Angular
- Experience with database modeling and development, data mining and warehousing
- Unit, Integration and User Acceptance Testing
- Effective communication skills (both written and verbal)
- Ability to collaborate with a diverse set of engineers, data scientists and product managers
- Comfort in a fast-paced start-up environment
- Experience in Agile methodology
- Proficient with SQL and its variation among popular databases
- Develop and maintain Java applications using Core Java, the Spring framework, JDBC, and threading concepts.
- Strong understanding of the Spring framework and its various modules.
- Experience with JDBC for database connectivity and manipulation
- Utilize database management systems to store and retrieve data efficiently.
- Proficiency in Core Java 8 and a thorough understanding of threading concepts and concurrent programming.
- Experience working with relational and NoSQL databases.
- Basic understanding of cloud platforms such as Azure and GCP, along with exposure to DevOps practices, is an added advantage.
- Knowledge of containerization technologies (e.g., Docker, Kubernetes)
- Perform debugging and troubleshooting of applications using log analysis techniques.
- Understand multi-service flow and integration between components.
- Handle large-scale data processing tasks efficiently and effectively.
- Hands on experience using Spark is an added advantage.
- Good problem-solving and analytical abilities.
- Collaborate with cross-functional teams to identify and solve complex technical problems.
- Knowledge of Agile methodologies such as Scrum or Kanban
- Stay updated with the latest technologies and industry trends to continuously improve development processes and methodologies
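As a quick, hedged illustration of the "complex analytical SQL" this role calls for — a window-function query over an invented `orders` table (schema and data are made up purely for illustration; SQLite's stdlib driver supports window functions from SQLite 3.25 onward):

```python
import sqlite3

def running_totals():
    """Per-customer running total via a SQL window function (illustrative schema)."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO orders VALUES (?, ?, ?)",
        [(1, "acme", 100.0), (2, "acme", 250.0), (3, "globex", 75.0)],
    )
    # PARTITION BY resets the running sum for each customer.
    return conn.execute(
        """
        SELECT customer, amount,
               SUM(amount) OVER (PARTITION BY customer ORDER BY id) AS running_total
        FROM orders
        ORDER BY customer, id
        """
    ).fetchall()

print(running_totals())
```

The same `SUM(...) OVER (PARTITION BY ...)` pattern carries over to PostgreSQL, MySQL 8+, and the analytical databases named in these postings.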
If interested, please share your resume with the following details:
Total Experience -
Relevant Experience in Java, Spring, Data Structures, Algorithms, SQL -
Relevant Experience in Cloud - AWS/Azure/GCP -
Current CTC -
Expected CTC -
Notice Period -
Reason for change -


Job Title: Backend Engineer – Python / Golang / Rust
Location: Bangalore, India
Experience Required: Minimum 2–3 years
About the Role
We are looking for a passionate Backend Engineer to join our growing engineering team. The ideal candidate should have hands-on experience in building enterprise-grade, scalable backend systems using microservices architecture. You will work closely with product, frontend, and DevOps teams to design, develop, and optimize robust backend solutions that can handle high traffic and ensure system reliability.
Key Responsibilities
• Design, develop, and maintain scalable backend services and APIs.
• Architect and implement microservices-based systems ensuring modularity and resilience.
• Optimize application performance, database queries, and service scalability.
• Collaborate with frontend engineers, product managers, and DevOps teams for seamless delivery.
• Implement security best practices and ensure data protection compliance.
• Write and maintain unit tests, integration tests, and documentation.
• Participate in code reviews, technical discussions, and architecture design sessions.
• Monitor, debug, and improve system performance in production environments.
Required Skills & Experience
• Programming Expertise:
• Advanced proficiency in Python (Django, FastAPI, or Flask), OR
• Strong experience in Golang or Rust for backend development.
• Microservices Architecture: Hands-on experience in designing and maintaining distributed systems.
• Database Management: Expertise in PostgreSQL, MySQL, MongoDB, including schema design and optimization.
• API Development: Strong experience in RESTful APIs and GraphQL.
• Cloud Platforms: Proficiency with AWS, GCP, or Azure for deployment and scaling.
• Containerization & Orchestration: Solid knowledge of Docker and Kubernetes.
• Messaging & Caching: Experience with Redis, RabbitMQ, Kafka, and caching strategies (Redis, Memcached).
• Version Control: Strong Git workflows and collaboration in team environments.
• Familiarity with CI/CD pipelines, DevOps practices, and cloud-native deployments.
• Proven experience working on production-grade, high-traffic applications.
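The "caching strategies" item above can be sketched in a few lines. This is a minimal in-process TTL cache — an illustration of the idea only; a production backend would use Redis or Memcached as the posting notes, and all names here are invented:

```python
import time

class TTLCache:
    """Minimal in-process cache with per-entry time-to-live (illustrative sketch)."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expiry timestamp)

    def set(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expiry = entry
        if time.monotonic() > expiry:
            del self._store[key]  # lazy eviction on read
            return None
        return value

cache = TTLCache(ttl_seconds=0.05)
cache.set("user:42", {"name": "Ada"})
print(cache.get("user:42"))  # fresh entry is returned
time.sleep(0.1)
print(cache.get("user:42"))  # expired entry reads as None
```

Real caches add eviction policies (LRU), size bounds, and cross-process sharing — exactly the concerns Redis/Memcached solve.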
Preferred Qualifications
• Understanding of software architecture patterns (event-driven, CQRS, hexagonal, etc.).
• Experience with Agile/Scrum methodologies.
• Contributions to open-source projects or strong personal backend projects.
• Experience with observability tools (Prometheus, Grafana, ELK, Jaeger).
Why Join Us?
• Work on cutting-edge backend systems that power enterprise-grade applications.
• Opportunity to learn and grow with a fast-paced engineering team.
• Exposure to cloud-native, microservices-based architectures.
• Collaborative culture that values innovation, ownership, and technical excellence.

Holistic technology solutions for the entertainment and leisure industry.


Required Qualifications & Skills:
- Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent practical experience.
- Minimum of 5+ years of professional experience in backend software development, with a strong focus on systems using .NET Framework / .NET Core (C#).
- Proven, significant experience in backend software development using the .NET Framework / .NET Core (C#, ASP.NET Web API, Entity Framework).
- Expert-level proficiency in T-SQL, including writing and optimizing complex queries, stored procedures, functions, and understanding database design principles (preferably with MS SQL Server).
- Demonstrated ability to read, understand, and analyze complex legacy code to determine functionality and troubleshoot issues effectively.
- Strong analytical and problem-solving skills, with experience in debugging and resolving production system issues.
- Experience in leading technical initiatives or mentoring junior engineers.
- Formal team lead experience is a strong plus for the Lead-level role.
- Solid understanding of software development lifecycle (SDLC), APIs, and backend architecture patterns.
- Excellent communication and interpersonal skills.
Responsibilities:
- Lead and Mentor: Guide, manage, and mentor a backend engineering team of approximately 4 members, fostering a collaborative and high-performing environment.
- Backend Development: Architect, design, develop, test, and deploy scalable and maintainable backend services, APIs, and database solutions using .NET (C#) and T-SQL.
- Enhancement Ownership: Take ownership of technical design and implementation for new features and enhancements requested for our POS and web platforms.
- Legacy System Expertise: Dive into existing codebases (.NET and T-SQL) to thoroughly understand current system functionality, data flows, and business logic, becoming a subject matter expert.
- Production Support: Act as a key escalation point for diagnosing and resolving complex production issues related to the backend systems. Perform root cause analysis and implement effective solutions.
- Technical Guidance: Provide technical direction, conduct code reviews, establish and enforce coding standards and best practices within the team.
- System Knowledge & Communication: Clearly articulate how backend systems work and answer technical questions from team members and other stakeholders.
- Collaboration: Work closely with front-end developers, QA testers, product managers, and potentially clients to deliver integrated solutions.
About the Role
We are seeking motivated Data Engineering Interns to join our team remotely for a 3-month internship. This role is designed for students or recent graduates interested in working with data pipelines, ETL processes, and big data tools. You will gain practical experience in building scalable data solutions. While this is an unpaid internship, interns who successfully complete the program will receive a Completion Certificate and a Letter of Recommendation.
Responsibilities
- Assist in designing and building data pipelines for structured and unstructured data.
- Support ETL (Extract, Transform, Load) processes to prepare data for analytics.
- Work with databases (SQL/NoSQL) for data storage and retrieval.
- Help optimize data workflows for performance and scalability.
- Collaborate with data scientists and analysts to ensure data quality and consistency.
- Document workflows, schemas, and technical processes.
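The extract/transform/load flow described above can be sketched end to end in plain Python — a toy run with an invented CSV payload and schema, standing in for the real pipelines an intern would work on:

```python
import csv
import io
import sqlite3

# Invented sample data for illustration; note the malformed "oops" amount.
RAW_CSV = "name,amount\nalice,10\nbob,oops\ncarol,30\n"

def extract(text):
    """Read raw CSV text into a list of dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Drop rows with non-numeric amounts and cast the rest to int."""
    clean = []
    for row in rows:
        try:
            clean.append({"name": row["name"], "amount": int(row["amount"])})
        except ValueError:
            continue  # real pipelines would route bad rows to a dead-letter store
    return clean

def load(rows):
    """Load the cleaned rows into an in-memory SQLite table."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE payments (name TEXT, amount INTEGER)")
    conn.executemany("INSERT INTO payments VALUES (:name, :amount)", rows)
    return conn

conn = load(transform(extract(RAW_CSV)))
print(conn.execute("SELECT SUM(amount) FROM payments").fetchone()[0])  # 40
```

Production versions swap the pieces for real sources, Spark/Airflow orchestration, and a warehouse target, but the E→T→L shape is the same.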
Requirements
- Strong interest in data engineering, databases, and big data systems.
- Basic knowledge of SQL and relational database concepts.
- Familiarity with Python, Java, or Scala for data processing.
- Understanding of ETL concepts and data pipelines.
- Exposure to cloud platforms (AWS, Azure, or GCP) is a plus.
- Familiarity with big data frameworks (Hadoop, Spark, Kafka) is an advantage.
- Good problem-solving skills and ability to work independently in a remote setup.
What You’ll Gain
- Hands-on experience in data engineering and ETL pipelines.
- Exposure to real-world data workflows.
- Mentorship and guidance from experienced engineers.
- Completion Certificate upon successful completion.
- Letter of Recommendation based on performance.
Internship Details
- Duration: 3 months
- Location: Remote (Work from Home)
- Stipend: Unpaid
- Perks: Completion Certificate + Letter of Recommendation


🚀 Hiring: Python Full Stack Developer
⭐ Experience: 4+ Years
📍 Location: Gurgaon
⭐ Work Mode: Hybrid
⏱️ Notice Period: Immediate Joiners
(Only immediate joiners & candidates serving notice period)
🎇 About the Role:
We are looking for an experienced Python Full Stack Developer (Backend Focus) with 4–6 years of experience to join our dynamic team. You will play a key role in backend development, API design, and data processing, while also contributing to frontend tasks when needed. This position provides excellent opportunities for growth and exposure to cutting-edge technologies.
✨ Required Skills & Experience
✅ Backend Development: Python (Django/Flask), MVC patterns
✅ Databases: SQL, PostgreSQL/MySQL
✅ API Development: RESTful APIs
✅ Testing: pytest, unittest, TDD
✅ Version Control: Git workflows
✅ Frontend Basics: React, JavaScript
✅ DevOps & Tools: Docker basics, CI/CD concepts, JSON/XML/CSV handling
✅ Cloud: Basic Azure knowledge
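To illustrate the pytest/unittest/TDD item above — a hedged, self-contained sketch using the stdlib `unittest` module, with a hypothetical `slugify` helper invented purely for the example:

```python
import unittest

def slugify(title):
    """Turn a page title into a URL slug (hypothetical function under test)."""
    return "-".join(title.lower().split())

class SlugifyTests(unittest.TestCase):
    def test_lowercases_and_hyphenates(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_collapses_repeated_whitespace(self):
        self.assertEqual(slugify("a  b"), "a-b")

# Run the suite programmatically rather than via unittest.main(),
# so it composes with larger scripts.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(SlugifyTests)
result = unittest.TextTestRunner().run(suite)
print(result.wasSuccessful())
```

In a TDD workflow the tests above would be written first, fail, and then drive the `slugify` implementation; pytest runs the same `unittest`-style cases unchanged.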
Profile: AWS Data Engineer
Mandatory skills: AWS + Databricks + PySpark + SQL
Location: Bangalore/Pune/Hyderabad/Chennai/Gurgaon
Notice Period: Immediate
Key Requirements :
- Design, build, and maintain scalable data pipelines to collect, process, and store data from multiple datasets.
- Optimize data storage solutions for better performance, scalability, and cost-efficiency.
- Develop and manage ETL/ELT processes to transform data as per schema definitions, apply slicing and dicing, and make it available for downstream jobs and other teams.
- Collaborate closely with cross-functional teams to understand system and product functionalities, pace up feature development, and capture evolving data requirements.
- Engage with stakeholders to gather requirements and create curated datasets for downstream consumption and end-user reporting.
- Automate deployment and CI/CD processes using GitHub workflows, identifying areas to reduce manual, repetitive work.
- Ensure compliance with data governance policies, privacy regulations, and security protocols.
- Utilize cloud platforms like AWS and work on Databricks for data processing with S3 Storage.
- Work with distributed systems and big data technologies such as Spark, SQL, and Delta Lake.
- Integrate with SFTP to push data securely from Databricks to remote locations.
- Analyze and interpret Spark query execution plans to fine-tune queries for faster and more efficient processing.
- Strong problem-solving and troubleshooting skills in large-scale distributed systems.

• Data Pipeline Development: Design and implement scalable data pipelines using PySpark and Databricks on AWS cloud infrastructure
• ETL/ELT Operations: Extract, transform, and load data from various sources using Python, SQL, and PySpark for batch and streaming data processing
• Databricks Platform Management: Develop and maintain data workflows, notebooks, and clusters in Databricks environment for efficient data processing
• AWS Cloud Services: Utilize AWS services including S3, Glue, EMR, Redshift, Kinesis, and Lambda for comprehensive data solutions
• Data Transformation: Write efficient PySpark scripts and SQL queries to process large-scale datasets and implement complex business logic
• Data Quality & Monitoring: Implement data validation, quality checks, and monitoring solutions to ensure data integrity across pipelines
• Collaboration: Work closely with data scientists, analysts, and other engineering teams to support analytics and machine learning initiatives
• Performance Optimization: Monitor and optimize data pipeline performance, query efficiency, and resource utilization in Databricks and AWS environments
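The "Data Quality & Monitoring" responsibility above boils down to running rule checks before publishing a dataset. A minimal pure-Python sketch follows — column names and rules are invented, and a real Databricks pipeline would express the same checks in PySpark or with a framework like Great Expectations:

```python
def quality_report(rows, required=("order_id", "amount")):
    """Count rows failing basic not-null and non-negative-amount checks."""
    report = {"rows": len(rows), "missing_fields": 0, "negative_amount": 0}
    for row in rows:
        if any(row.get(col) is None for col in required):
            report["missing_fields"] += 1
        elif row["amount"] < 0:
            report["negative_amount"] += 1
    return report

# Invented sample batch: one clean row, one rule violation of each kind.
batch = [
    {"order_id": 1, "amount": 10.0},
    {"order_id": 2, "amount": -5.0},    # fails the non-negative rule
    {"order_id": None, "amount": 3.0},  # fails the not-null rule
]
print(quality_report(batch))
```

A pipeline would typically fail the job, quarantine the bad rows, or emit metrics to CloudWatch when such a report exceeds a threshold.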
Required Qualifications:
• Experience: 3+ years of hands-on experience in data engineering, ETL development, or related field
• PySpark Expertise: Strong proficiency in PySpark for large-scale data processing and transformations
• Python Programming: Solid Python programming skills with experience in data manipulation libraries (pandas, etc.)
• SQL Proficiency: Advanced SQL skills including complex queries, window functions, and performance optimization
• Databricks Experience: Hands-on experience with Databricks platform, including notebook development, cluster management, and job scheduling
• AWS Cloud Services: Working knowledge of core AWS services (S3, Glue, EMR, Redshift, IAM, Lambda)
• Data Modeling: Understanding of dimensional modeling, data warehousing concepts, and ETL best practices
• Version Control: Experience with Git and collaborative development workflows
Preferred Qualifications:
• Education: Bachelor's degree in Computer Science, Engineering, Mathematics, or related technical field
• Advanced AWS: Experience with additional AWS services like Athena, QuickSight, Step Functions, and CloudWatch
• Data Formats: Experience working with various data formats (JSON, Parquet, Avro, Delta Lake)
• Containerization: Basic knowledge of Docker and container orchestration
• Agile Methodology: Experience working in Agile/Scrum development environments
• Business Intelligence Tools: Exposure to BI tools like Tableau, Power BI, or Databricks SQL Analytics
Technical Skills Summary:
Core Technologies:
- PySpark & Spark SQL
- Python (pandas, boto3)
- SQL (PostgreSQL, MySQL, Redshift)
- Databricks (notebooks, clusters, jobs, Delta Lake)
AWS Services:
- S3, Glue, EMR, Redshift
- Lambda, Athena
- IAM, CloudWatch
Development Tools:
- Git/GitHub
- CI/CD pipelines, Docker
- Linux/Unix command line
Location & Work Model:
- Position: Contract to hire
- Location: Bangalore (Marathahalli)
- Work mode: Work from Office, all 5 days
- Looking for immediate joiners
Technical Requirements:
- Strong experience in Java Backend Development
- Proficiency in both SQL & NoSQL databases (e.g., MongoDB, PostgreSQL, MySQL)
- Basic knowledge of DevOps tools (CI/CD pipeline)
- Familiarity with cloud providers (AWS, GCP)
- Ability to quickly learn and adapt to emerging technologies, including AI-driven tools and automation solutions
- Strong problem-solving mindset with an interest in leveraging AI and data-driven approaches for backend optimizations
Soft Skills & Mindset:
- Strong communication & articulation skills
- Structured thought process and a logical approach to problem-solving
- Interest & willingness to learn and adapt to new technologies
Roles and Responsibilities:
- Develop high-quality code and employ object-oriented design principles while strictly adhering to best coding practices.
- Ability to work independently
- Demonstrate substantial expertise in Core Java, Multithreading, and Spring/Spring Boot frameworks.
- Possess a solid understanding of Spring, Hibernate, Caching Frameworks, and Memory Management.
- Proficient in crafting complex analytical SQL queries to meet project requirements.
- Contribute to the development of Highly Scalable applications.
- Design and implement Rest-based applications with efficiency and precision.
- Create comprehensive unit tests using frameworks such as JUnit and Mockito.
- Engage in the Continuous Integration/Continuous Deployment (CI/CD) process and utilize build tools like Git and Maven.
- Familiarity with any Cloud service provider is considered an added advantage.
Required Skills and Experience :
- Experience in cloud platform AWS is necessary.
- Experience with big data processing frameworks like Apache Spark, Flink, Kafka, etc.
- In-depth proficiency in Java, Spring Boot, Spring Frameworks, Hibernate, SQL, and Unit Testing frameworks.
- Good knowledge of SQL and experience with complex queries is a must
- Experience in supporting and debugging issues in the production environment is a must.
- Experience with analytical databases like Redshift, BigQuery, Snowflake, ClickHouse, etc. is a plus.

Job Title: React.js Developer
We are seeking a talented React.js Developer to build and maintain high-quality web applications. The ideal candidate should have strong experience in JavaScript/TypeScript, React.js, HTML, CSS, and state management (Redux/Context API). You will work closely with our team to develop responsive UIs, integrate APIs, and ensure performance optimization.
Requirements:
- 2+ years of experience in React.js development
- Strong knowledge of JavaScript (ES6+), React hooks, and component-based architecture
- Familiarity with RESTful APIs, Git, and modern front-end tools
- Bonus: Experience with Next.js, Tailwind, or testing frameworks