50+ SQL Jobs in India

Full Stack Developer (Node.js & React)
Location: Pune, India (Local or Ready to Relocate)
Employment Type: 6–8 Month Contract (Potential Conversion to FTE Based on Performance)
About the Role
We are seeking a highly skilled Full Stack Developer with expertise in Node.js and React to join our dynamic team in Pune. This role involves designing, developing, and deploying scalable web applications. You will collaborate with cross-functional teams to deliver high-impact solutions while adhering to best practices in coding, testing, and security.
Key Responsibilities
- Develop and maintain server-side applications using Node.js (Express/NestJS) and client-side interfaces with React.js (Redux/Hooks).
- Architect RESTful APIs and integrate with databases (SQL/NoSQL) and third-party services.
- Implement responsive UI/UX designs with modern front-end libraries (e.g., Material-UI, Tailwind CSS).
- Write unit/integration tests (Jest, Mocha, React Testing Library) and ensure code quality via CI/CD pipelines.
- Collaborate with product managers, designers, and QA engineers in an Agile environment.
- Troubleshoot performance bottlenecks and optimize applications for scalability.
- Document technical specifications and deployment processes.
Required Skills & Qualifications
- Experience: 3+ years in full-stack development with Node.js and React.
- Backend Proficiency:
- Strong knowledge of Node.js, Express, or NestJS.
- Experience with databases (PostgreSQL, MongoDB, Redis).
- API design (REST/GraphQL) and authentication (JWT/OAuth).
- Frontend Proficiency:
- Expertise in React.js (Functional Components, Hooks, Context API).
- State management (Redux, Zustand) and modern CSS frameworks.
- DevOps & Tools:
- Git, Docker, AWS/Azure, and CI/CD tools (Jenkins/GitHub Actions).
- Testing frameworks (Jest, Cypress, Mocha).
- Soft Skills:
- Problem-solving mindset and ability to work in a fast-paced environment.
- Excellent communication and collaboration skills.
- Location: Based in Pune or willing to relocate immediately.
Preferred Qualifications
- Experience with TypeScript, Next.js, or serverless architectures.
- Knowledge of microservices, message brokers (Kafka/RabbitMQ), or container orchestration (Kubernetes).
- Familiarity with Agile/Scrum methodologies.
- Contributions to open-source projects or a strong GitHub portfolio.
What We Offer
- Competitive Contract Compensation with timely payouts.
- Potential for FTE Conversion: Performance-based path to a full-time role.
- Hybrid Work Model: Flexible in-office (Pune) and remote options.
- Learning Opportunities: Access to cutting-edge tools and mentorship.
- Collaborative Environment: Work with industry experts on innovative projects.
Apply Now!
Ready to make an impact? Send your resume and GitHub/Portfolio links with the subject line:
"Full Stack Developer (Node/React) - Pune".
Local candidates or those relocating to Pune will be prioritized. Applications without portfolios will not be considered.
Equal Opportunity Employer
We celebrate diversity and are committed to creating an inclusive environment for all employees.
Roles:
- Experience with object-oriented analysis, design models, and design patterns.
- Skilled at translating use cases and other system requirements into a system design.
- Demonstrate excellent communication skills, including the ability to effectively communicate with internal and external customers.
- Ability to apply strong industry knowledge to understand customer needs and resolve customer concerns, with a high level of focus and attention to detail.
Preferred skills and qualification:
• 2+ years of Programming using PowerBuilder.
• Experience in PowerBuilder 2019 version is preferred.
• Experience in PFC is added advantage.
• Experience in programming using Oracle version 11G and SQL Server 2016
• Experience working in an Agile environment.
• High level of proficiency in MS Office including Word, Excel, Visio, and PowerPoint
About Fundly
Fundly is building a retailer-centric Pharma Digital Supply Chain Finance platform and marketplace for over 10 million pharma retailers in India.
- Founded by experienced industry professionals with a cumulative experience of 30+ years
- Grown to 100+ people across 20 cities in less than 3 years
- AUM of INR 100+ crores
- Raised venture capital of USD 5M so far
- Fast-growing: 3000+ retailers, 36,000 transactions, and ₹200+ crore disbursed in the last 2 years
- Technology-first and customer-first fintech organization
Opportunity at Fundly
- Be an early team member
- Gain visibility and influence the product and technology roadmap
Responsibilities
- Understand business requirements, customer persona, and product/applications
- Plan and execute test strategy for different projects
- Create test documentation, test plans, and test cases
- Report, provide feedback, and suggest improvements
- Collaborate with other stakeholders
Qualifications
- 3+ years of hands-on experience in QA processes, test planning, and execution
- Hands-on experience in SQL and NoSQL databases like MySQL and Postgres
Who You Are
- Love to understand and solve problems—be it technology, business, or people-related
- Like to take responsibility and accountability
- Have worked in fast-paced environments and are willing to break standard benchmarks of performance
- Hands-on experience with STLC and automation testing
AccioJob is conducting a Walk-In Hiring Drive with StatusNeo for the position of Java Developer (Female candidates only).
To apply, register and select your slot here: https://go.acciojob.com/KAPWBu
Required Skills: Java, Spring Boot, SQL
Eligibility:
- Degree: BTech./BE, MCA
- Branch: Computer Science/CSE/Other CS related branch, IT, Electrical/Other electrical related branches
- Graduation Year: 2023, 2024, 2025
Work Details:
- Work Location: Gurugram (Onsite)
- CTC: ₹7 LPA to ₹10 LPA
Evaluation Process:
Round 1: Offline Assessment at AccioJob Gurugram Office
Further Rounds (for shortlisted candidates only):
Profile & Background Screening Round, Technical Interview Round 1, Technical Interview Round 2, Tech + Managerial Round
Important Note: Bring your laptop & earphones for the test.
Register here: https://go.acciojob.com/KAPWBu
Or apply faster with our newly launched app: https://go.acciojob.com/8UKhBL
AccioJob is conducting a Walk-In Hiring Drive with FloBiz for the position of Backend Intern.
To apply, register and select your slot here: https://go.acciojob.com/dkfKBz
Required Skills: SQL, RestAPI, OOPs, DSA
Eligibility:
- Degree: BTech./BE, BCA, BSc.
- Branch: Computer Science/CSE/Other CS related branch, IT
- Graduation Year: 2025, 2026
Work Details:
- Work Location: Remote
- CTC: ₹12 LPA to ₹15 LPA
Evaluation Process:
Round 1: Offline Assessment at AccioJob Skill Centres (Noida, Pune, Chennai, Hyderabad, Bangalore)
Further Rounds (for shortlisted candidates only):
Profile & Background Screening Round, Technical Interview Round 1, Technical Interview Round 2, Cultural Fit Round
Important Note: Bring your laptop & earphones for the test.
Register here: https://go.acciojob.com/dkfKBz
Or apply in seconds straight from our brand-new app: https://go.acciojob.com/L6rH7C
🌐 Job Title: Senior Azure Developer
🏢 Department: Digital Engineering
📍 Location: Pune (Work from Office)
📄 Job Type: Full-time
💼 Experience Required: 5+ years
💰 Compensation: Best in the industry
🔧 Roles & Responsibilities:
- Design, develop, and implement solutions using Microsoft Azure with .NET and other technologies.
- Collaborate with business analysts and end users to define system requirements.
- Work with QA teams to ensure solution integrity and functionality.
- Communicate frequently with stakeholders and team members to track progress and validate requirements.
- Evaluate and present technical solutions and recommendations.
- Provide technical mentoring and training to peers and junior developers.
💡 Technical Requirements:
- Minimum 2 years of hands-on development experience in:
- Azure Logic Apps
- Azure Service Bus
- Azure Web/API Apps
- Azure Functions
- Azure SQL Database / Cosmos DB
- Minimum 2 years’ experience in enterprise software development using .NET stack:
- REST APIs
- Web Applications
- Distributed Systems
- Familiarity with security best practices (e.g., OWASP).
- Knowledge of NoSQL data stores is an added advantage.
We are hiring a skilled Backend Developer to design and manage server-side applications, APIs, and database systems.
Key Responsibilities:
- Develop and manage APIs with Node.js and Express.js.
- Work with MongoDB and Mongoose for database management.
- Implement secure authentication using JWT.
- Optimize backend systems for performance and scalability.
- Deploy backend services on VPS and manage servers.
- Collaborate with frontend teams and use Git/GitHub for version control.
Required Skills:
- Node.js, Express.js
- MongoDB, Mongoose
- REST API, JWT
- Git, GitHub, VPS hosting
Qualifications:
- Bachelor’s degree in Computer Science or related field.
- Strong portfolio or GitHub profile preferred.
Job Title : Senior Data Engineer
Experience : 6 to 10 Years
Location : Gurgaon (Hybrid – 3 days office / 2 days WFH)
Notice Period : Immediate to 30 days (Buyout option available)
About the Role :
We are looking for an experienced Senior Data Engineer to join our Digital IT team in Gurgaon.
This role involves building scalable data pipelines, managing data architecture, and ensuring smooth data flow across the organization while maintaining high standards of security and compliance.
Mandatory Skills :
Azure Data Factory (ADF), Azure Cloud Services, SQL, Data Modelling, CI/CD tools, Git, Data Governance, RDBMS & NoSQL databases (e.g., SQL Server, PostgreSQL, Redis, ElasticSearch), Data Lake migration.
Key Responsibilities :
- Design and develop secure, scalable end-to-end data pipelines using Azure Data Factory (ADF) and Azure services (see the sketch after this list).
- Build and optimize data architectures (including Medallion Architecture).
- Collaborate with cross-functional teams on cybersecurity, data privacy (e.g., GDPR), and governance.
- Manage structured/unstructured data migration to Data Lake.
- Ensure CI/CD integration for data workflows and version control using Git.
- Identify and integrate data sources (internal/external) in line with business needs.
- Proactively highlight gaps and risks related to data compliance and integrity.
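For illustration, here is a minimal sketch of triggering and monitoring an ADF pipeline run from Python, assuming the azure-identity and azure-mgmt-datafactory SDKs; the subscription, resource group, factory, pipeline, and parameter names are placeholders, not part of this role's actual environment.

```python
# Minimal sketch: trigger and monitor an Azure Data Factory pipeline run.
# Assumes the azure-identity and azure-mgmt-datafactory SDKs; all names below
# are placeholders for illustration only.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "rg-data-platform"   # placeholder
FACTORY_NAME = "adf-enterprise"       # placeholder
PIPELINE_NAME = "pl_ingest_sales"     # placeholder

def trigger_pipeline() -> str:
    """Kick off a pipeline run and return its run ID."""
    credential = DefaultAzureCredential()
    adf = DataFactoryManagementClient(credential, SUBSCRIPTION_ID)

    run = adf.pipelines.create_run(
        RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME,
        parameters={"load_date": "2024-01-01"},  # example pipeline parameter
    )

    # Check the run status once; a real workflow would poll with a backoff.
    status = adf.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id).status
    print(f"Pipeline {PIPELINE_NAME} run {run.run_id}: {status}")
    return run.run_id

if __name__ == "__main__":
    trigger_pipeline()
```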
Required Skills :
- Azure Data Factory (ADF) – Mandatory
- Strong SQL and Data Modelling expertise.
- Hands-on with Azure Cloud Services and data architecture.
- Experience with CI/CD tools and version control (Git).
- Good understanding of Data Governance practices.
- Exposure to ETL/ELT pipelines and Data Lake migration.
- Working knowledge of RDBMS and NoSQL databases (e.g., SQL Server, PostgreSQL, Redis, ElasticSearch).
- Understanding of RESTful APIs, deployment on cloud/on-prem infrastructure.
- Strong problem-solving, communication, and collaboration skills.
Additional Info :
- Work Mode : Hybrid (No remote); relocation to Gurgaon required for non-NCR candidates.
- Communication : Above-average verbal and written English skills.
Perks & Benefits :
- 5 Days work week
- Global exposure and leadership collaboration.
- Health insurance, employee-friendly policies, training and development.
Job Title: Backend Developer
Location: In-Office, Bangalore, Karnataka, India
Job Summary:
We are seeking a highly skilled and experienced Backend Developer with a minimum of 1 year of experience in product building to join our dynamic and innovative team. In this role, you will be responsible for designing, developing, and maintaining robust backend systems that drive our applications. You will collaborate with cross-functional teams to ensure seamless integration between frontend and backend components, and your expertise will be critical in architecting scalable, secure, and high-performance backend solutions.
Annual Compensation: 6-10 LPA
Responsibilities:
- Design, develop, and maintain scalable and efficient backend systems and APIs using NodeJS.
- Architect and implement complex backend solutions, ensuring high availability and performance.
- Collaborate with product managers, frontend developers, and other stakeholders to deliver comprehensive end-to-end solutions.
- Design and optimize data storage solutions using relational databases (e.g., MySQL) and NoSQL databases (e.g., MongoDB, Redis).
- Promote a culture of collaboration, knowledge sharing, and continuous improvement.
- Implement and enforce best practices for code quality, security, and performance optimization.
- Develop and maintain CI/CD pipelines to automate build, test, and deployment processes.
- Ensure comprehensive test coverage, including unit testing, and implement various testing methodologies and tools to validate application functionality.
- Utilize cloud services (e.g., AWS, Azure, GCP) for infrastructure deployment, management, and optimization.
- Conduct system design reviews and contribute to architectural discussions.
- Stay updated with industry trends and emerging technologies to drive innovation within the team.
- Implement secure authentication and authorization mechanisms and ensure data encryption for sensitive information.
- Design and develop event-driven applications utilizing serverless computing principles to enhance scalability and efficiency.
Requirements:
- Minimum of 1 year of proven experience as a Backend Developer, with a strong portfolio of product-building projects.
- Extensive experience with JavaScript backend frameworks (e.g., Express, Socket) and a deep understanding of their ecosystems.
- Strong expertise in SQL and NoSQL databases (MySQL and MongoDB) with a focus on data modeling and scalability.
- Practical experience with Redis and caching mechanisms to enhance application performance.
- Proficient in RESTful API design and development, with a strong understanding of API security best practices.
- In-depth knowledge of asynchronous programming and event-driven architecture.
- Familiarity with the entire web stack, including protocols, web server optimization techniques, and performance tuning.
- Experience with containerization and orchestration technologies (e.g., Docker, Kubernetes) is highly desirable.
- Proven experience working with cloud technologies (AWS/GCP/Azure) and understanding of cloud architecture principles.
- Strong understanding of fundamental design principles behind scalable applications and microservices architecture.
- Excellent problem-solving, analytical, and communication skills.
- Ability to work collaboratively in a fast-paced, agile environment and lead projects to successful completion.
How to Apply
Visit: https://www.thealteroffice.com/about
Position Overview: We are looking for an experienced and highly skilled Senior Data Engineer to join our team and help design, implement, and optimize data systems that support high-end analytical solutions for our clients. As a customer-centric Data Engineer, you will work closely with clients to understand their business needs and translate them into robust, scalable, and efficient technical solutions. You will be responsible for end-to-end data modelling, integration workflows, and data transformation processes while ensuring security, privacy, and compliance. In this role, you will also leverage the latest advancements in artificial intelligence, machine learning, and large language models (LLMs) to deliver high-impact solutions that drive business success. The ideal candidate will have a deep understanding of data infrastructure, optimization techniques, and cost-effective data management.
Key Responsibilities:
• Customer Collaboration:
– Partner with clients to gather and understand their business requirements, translating them into actionable technical specifications.
– Act as the primary technical consultant to guide clients through data challenges and deliver tailored solutions that drive value.
• Data Modeling & Integration:
– Design and implement scalable, efficient, and optimized data models to support business operations and analytical needs.
– Develop and maintain data integration workflows to seamlessly extract, transform, and load (ETL) data from various sources into data repositories.
– Ensure smooth integration between multiple data sources and platforms, including cloud and on-premise systems.
• Data Processing & Optimization:
– Develop, optimize, and manage data processing pipelines to enable real-time and batch data processing at scale.
– Continuously evaluate and improve data processing performance, optimizing for throughput while minimizing infrastructure costs.
• Data Governance & Security:
– Implement and enforce data governance policies and best practices, ensuring data security, privacy, and compliance with relevant industry regulations (e.g., GDPR, HIPAA).
– Collaborate with security teams to safeguard sensitive data and maintain privacy controls across data environments.
• Cross-Functional Collaboration:
– Work closely with data engineers, data scientists, and business analysts to ensure that the data architecture aligns with organizational objectives and delivers actionable insights.
– Foster collaboration across teams to streamline data workflows and optimize solution delivery.
• Leveraging Advanced Technologies:
– Utilize AI, machine learning models, and large language models (LLMs) to automate processes, accelerate delivery, and provide smart, data-driven solutions to business challenges.
– Identify opportunities to apply cutting-edge technologies to improve the efficiency, speed, and quality of data processing and analytics.
• Cost Optimization:
– Proactively manage infrastructure and cloud resources to optimize throughput while minimizing operational costs.
– Make data-driven recommendations to reduce infrastructure overhead and increase efficiency without sacrificing performance.
Qualifications:
• Experience:
– Proven experience (5+ years) as a Data Engineer or similar role, designing and implementing data solutions at scale.
– Strong expertise in data modelling, data integration (ETL), and data transformation processes.
– Experience with cloud platforms (AWS, Azure, Google Cloud) and big data technologies (e.g., Hadoop, Spark).
• Technical Skills:
– Advanced proficiency in SQL, data modelling tools (e.g., Erwin, PowerDesigner), and data integration frameworks (e.g., Apache NiFi, Talend).
– Strong understanding of data security protocols, privacy regulations, and compliance requirements.
– Experience with data storage solutions (e.g., data lakes, data warehouses, NoSQL, relational databases).
• AI & Machine Learning Exposure:
– Familiarity with leveraging AI and machine learning technologies (e.g., TensorFlow, PyTorch, scikit-learn) to optimize data processing and analytical tasks.
– Ability to apply advanced algorithms and automation techniques to improve business processes.
• Soft Skills:
– Excellent communication skills to collaborate with clients, stakeholders, and cross-functional teams.
– Strong problem-solving ability with a customer-centric approach to solution design.
– Ability to translate complex technical concepts into clear, understandable terms for non-technical audiences.
• Education:
– Bachelor’s or Master’s degree in Computer Science, Information Systems, Data Science, or a related field (or equivalent practical experience).
LIFE AT FOUNTANE:
- Fountane offers an environment where all members are supported, challenged, recognized & given opportunities to grow to their fullest potential.
- Competitive pay
- Health insurance for spouses, kids, and parents.
- PF/ESI or equivalent
- Individual/team bonuses
- Employee stock ownership plan
- Fun/challenging variety of projects/industries
- Flexible workplace policy - remote/physical
- Flat organization - no micromanagement
- Individual contribution - set your deadlines
- Above all - culture that helps you grow exponentially!
A LITTLE BIT ABOUT THE COMPANY:
Established in 2017, Fountane Inc is a Ventures Lab incubating and investing in new competitive technology businesses from scratch. Thus far, we’ve created half a dozen multi-million valuation companies in the US and a handful of sister ventures for large corporations, including Target, US Ventures, and Imprint Engine.
We’re a team of 120+ people from around the world who are radically open-minded and believe in excellence, respecting one another, and pushing our boundaries further than ever before.
We are looking for a dedicated and proactive QA Engineer with 3–4 years of experience in both manual and automation testing. The ideal candidate should be detail-oriented and ready to take ownership of the product’s quality by actively contributing to testing strategies, writing automation scripts, and collaborating closely with cross-functional teams.
This role is ideal for someone who is strong in manual testing and has experience writing automation test scripts, even if they are still building their understanding of complete automation frameworks. As part of their growth, we expect candidates to develop framework knowledge to eventually automate full test suites.
Key Responsibilities:
- Design and execute manual test cases based on requirements, user stories, and acceptance criteria.
- Write automation test scripts for UI/API layers using tools and languages like Selenium, Java, Python, or JavaScript (any framework knowledge is welcome).
- Collaborate with developers, product owners, and QA team members during Agile sprints.
- Prepare and maintain test documentation, including test cases, test reports, and checklists.
- Track and manage bugs and issues using tools such as JIRA or ClickUp.
- Perform API testing using Postman, REST clients, or similar tools.
- Participate in release processes, including preparing release notes and verifying successful deployments.
- Take ownership of testing and be willing to drive the QA process end-to-end, ensuring product quality at every stage.
- Perform cross-browser and platform testing, data validation, and backend checks using SQL (see the sketch after this list).
- Contribute to the long-term improvement of QA processes and test coverage.
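As an illustration of the API, UI, and SQL checks above, here is a minimal pytest-style sketch; the endpoints, selectors, and the local SQLite database are hypothetical stand-ins for the real application under test.

```python
# Illustrative sketch only: an API check, a UI check, and a SQL backend check.
# The base URL, page elements, and database file are hypothetical.
import sqlite3

import requests
from selenium import webdriver
from selenium.webdriver.common.by import By

BASE_URL = "https://example.com"  # placeholder application URL

def test_health_endpoint():
    """API layer: the service responds and reports a healthy status."""
    response = requests.get(f"{BASE_URL}/api/health", timeout=10)
    assert response.status_code == 200
    assert response.json().get("status") == "ok"

def test_login_page_title():
    """UI layer: the login page loads and shows the expected heading."""
    driver = webdriver.Chrome()
    try:
        driver.get(f"{BASE_URL}/login")
        heading = driver.find_element(By.TAG_NAME, "h1")
        assert "Login" in heading.text
    finally:
        driver.quit()

def test_order_count_matches_backend():
    """Data layer: backend validation with SQL (SQLite used as a stand-in)."""
    conn = sqlite3.connect("test_data.db")
    try:
        count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
        assert count >= 0
    finally:
        conn.close()
```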
Required Skills and Qualifications:
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- 3–4 years of hands-on experience in manual testing and writing automation scripts.
- Strong understanding of STLC, SDLC, Agile methodology, and the defect life cycle.
- Basic to intermediate knowledge of automation tools such as Selenium or similar; ability to write automation scripts.
- Willingness to learn and gain experience in automation frameworks (e.g., TestNG, JUnit, BDD frameworks, etc.).
- Familiarity with API testing tools like Postman or REST Assured.
- Good knowledge of SQL for database-level testing.
- Experience with bug tracking and test management tools like JIRA and ClickUp.
- Excellent communication skills, strong analytical and problem-solving abilities.
- Ability to own the product's quality and work proactively with minimal supervision.
Nice to Have:
- Exposure to CI/CD tools (e.g., Jenkins, GitHub Actions).
- Experience with version control systems like Git.
- Mobile testing experience is a plus.
Who Should Apply
- 3–4 years of experience in manual testing and writing automation scripts
- Familiar with Selenium or similar tools; eager to learn test frameworks
- Strong in API testing (Postman) and SQL for backend validation
- Hands-on with JIRA, ClickUp, or similar tools
- Willing to own product quality and drive testing independently
- Based in Hyderabad or open to relocating (on-site role)
Perks & Benefits
- Competitive salary & performance bonuses
- Direct ownership of features and real startup exposure
- Letter of Recommendation
Note: The final CTC will be decided based on the candidate's skills, experience, and performance during the interview process. We are open to rewarding talent that brings value, ownership, and a growth mindset to the team.
Job Title: Lead Java Developer
Location: Bangalore
Experience: 8-12 years (only)
Job Overview:
We are looking for a highly experienced Java / Backend Developer with a strong background in investment banking/trading/brokerages to join our team. The ideal candidate will have extensive experience in developing front office and middle office systems which require high availability, high reliability and low latency/high throughput performance profiles.
Skills & Qualifications:
- 8 to 12 years of experience in Java development, with at least 4 years in the investment banking/financial services domain.
- Strong knowledge of Java, Spring Framework, RESTful web services.
- Hands-on experience with AWS Cloud and related services. Knowledge of Kubernetes is a plus.
- Solid experience with Kafka and/or messaging protocols.
- Familiarity with SQL databases (e.g., Postgres/Oracle).
- Strong problem-solving skills and ability to work in a team.
- Ability to understand and work with distributed systems / microservices architecture.
- Solid written and verbal communication skills to interface with other technology and business stakeholders.
Background
Fisdom is a leading digital wealth management platform. The Fisdom platform (mobile and web apps) gives consumers access to a wide bouquet of financial solutions – investments, savings, and protection (and many more in the pipeline). Fisdom blends cutting-edge technology with conventional financial wisdom, awesome UX, and friendly customer service to make financial products simpler and more accessible to millions of Indians. We are growing and constantly looking for high performers to participate in our growth story. We have recently been certified as a Great Place to Work. For more info, visit www.fisdom.com.
Objectives of this Role
Improve, execute, and effectively communicate significant analyses that identify opportunities across the business
Participate in meetings with management, assessing and addressing issues to identify and implement improvements toward efficient operations
Provide strong and timely financial and business analytic decision support to various organizational stakeholders
Responsibilities
Interpret data and analyze results using analytics, research methodologies, and statistical techniques
Develop and implement data analyses, data collection systems, and other strategies that optimize statistical efficiency and quality
Prepare and summarize weekly, monthly, and periodic results for use by key stakeholders
Conduct the full lifecycle of analytics projects, from project requirements documentation to design and execution, including pulling, manipulating, and exporting data
Evaluate key performance indicators, provide ongoing reports, and recommend business plan updates (see the sketch after this list)
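A minimal sketch of the SQL-plus-Python analysis work described above, assuming a SQLite database with a hypothetical transactions table; pandas rolls the rows up into weekly KPIs.

```python
# Minimal sketch: pull data with SQL, summarize weekly KPIs with pandas.
# The database file, table, and column names are hypothetical.
import sqlite3
import pandas as pd

def weekly_kpi_summary(db_path: str = "business.db") -> pd.DataFrame:
    """Aggregate transactions into weekly revenue and order-count KPIs."""
    conn = sqlite3.connect(db_path)
    try:
        df = pd.read_sql_query(
            "SELECT order_date, amount FROM transactions", conn,
            parse_dates=["order_date"],
        )
    finally:
        conn.close()

    # Bucket each transaction into its calendar week, then aggregate.
    df["week"] = df["order_date"].dt.to_period("W").dt.start_time
    weekly = (
        df.groupby("week")["amount"]
          .agg(revenue="sum", orders="count")
          .reset_index()
    )
    return weekly

if __name__ == "__main__":
    print(weekly_kpi_summary().head())
```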
Skills and Qualifications
Bachelor’s degree, preferably in computer science, mathematics, or economics
Advanced analytical skills with experience collecting, organizing, analyzing, and disseminating information with accuracy
The ability to present findings in a polished way
Proficiency with statistics and dataset analytics (using SQL, Python, Excel)
Entrepreneurial mindset, with an innovative approach to business planning
Relevant industry experience of 2–6 years; more than 4 years of Python experience is a must
Preferable: product startups, fintech
Why join us and where?
We have an excellent work culture and an opportunity to be a part of a growing organization with immense learning possibilities. You have an opportunity to build a brand from scratch. All of this, along with top-of-the-line remuneration and challenging work. You will be based out of Bangalore.
About Us:
At Vahan, we are building the first AI-powered recruitment marketplace for India’s 300-million-strong blue-collar workforce, opening doors to economic opportunities and brighter futures.
Already India’s largest recruitment platform, Vahan is supported by marquee investors like Khosla Ventures, Y Combinator, Airtel, Vijay Shekhar Sharma (CEO, Paytm), and leading executives from Google and Facebook.
Our customers include names like Swiggy, Zomato, Rapido, Zepto, and many more. We leverage cutting-edge technology and AI to recruit for the workforces of some of the most recognized companies in the country.
Our vision is ambitious: to become the go-to platform for blue-collar professionals worldwide, empowering them with not just earning opportunities but also the tools, benefits, and support they need to thrive. We aim to impact over a billion lives worldwide, creating a future where everyone has access to economic prosperity.
If our vision excites you, Vahan might just be your next adventure. We’re on the hunt for driven individuals who love tackling big challenges. If this sounds like your kind of journey, dive into the details and see where you can make your mark.
What You Will Be Doing:
- Build & Automate Cloud Infrastructure – Design, deploy, and optimize cloud environments, ensuring scalability, reliability, and cost efficiency.
- Set Up CI/CD & Deployment Pipelines – Develop automated workflows to streamline code integration, testing, and deployment for faster releases.
- Monitor & Improve System Performance – Implement robust monitoring, logging, and alerting mechanisms to proactively identify and resolve issues.
- Manage Containers & Scalability – Deploy and maintain containerized applications, ensuring efficient resource utilization and high availability.
- Ensure Security & Reliability – Enforce access controls, backup strategies, and disaster recovery plans to safeguard infrastructure and data (see the sketch after this list).
- Adapt & Scale with the Startup – Take on dynamic responsibilities, quickly learn new technologies, and evolve processes to meet growing business needs.
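As a small illustration of the backup-automation idea above (not the team's actual tooling), here is a boto3 sketch that snapshots every EBS volume tagged Backup=true; the region and tag name are assumptions.

```python
# Illustrative automation sketch with boto3: snapshot every EBS volume that
# carries a Backup=true tag. Region and tag names are assumptions.
import datetime
import boto3

def snapshot_tagged_volumes(region: str = "ap-south-1") -> list[str]:
    """Create a snapshot for each EBS volume carrying the Backup=true tag."""
    ec2 = boto3.client("ec2", region_name=region)
    volumes = ec2.describe_volumes(
        Filters=[{"Name": "tag:Backup", "Values": ["true"]}]
    )["Volumes"]

    snapshot_ids = []
    stamp = datetime.datetime.utcnow().strftime("%Y-%m-%d")
    for volume in volumes:
        snap = ec2.create_snapshot(
            VolumeId=volume["VolumeId"],
            Description=f"Automated backup {stamp} for {volume['VolumeId']}",
        )
        snapshot_ids.append(snap["SnapshotId"])
        print(f"Started snapshot {snap['SnapshotId']} for {volume['VolumeId']}")
    return snapshot_ids

if __name__ == "__main__":
    snapshot_tagged_volumes()
```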
You Will Thrive in This Role If You:
Must Haves:
- Experience: 3+ years in DevOps or related roles, focusing on cloud environments, automation, CI/CD, and Linux system administration. Strong expertise in debugging and infrastructure performance improvements.
- Cloud Expertise: In-depth experience with one or more cloud platforms (AWS, GCP), including services like EC2, RDS, S3, VPC, etc.
- IaC Tools: Proficiency in Terraform, Ansible, CloudFormation, or similar tools.
- Scripting Skills: Strong scripting abilities in Python, Bash, or PowerShell.
- Containerization: Experience with Docker, including managing containers in production.
- Monitoring Tools: Hands-on experience with tools like ELK, Prometheus, Grafana, CloudWatch, New Relic, and Datadog.
- Version Control: Proficiency with Git and code repository management.
- Soft Skills: Excellent problem-solving skills, attention to detail, and effective communication with both technical and non-technical team members.
- Database Management: Experience with managing and tuning databases like MySQL and PostgreSQL.
- Deployment Pipelines: Experience with Jenkins and similar CI/CD tools.
- Message Queues: Experience with RabbitMQ/SQS/Kafka.
Nice to Have:
- Certifications: AWS Certified DevOps Engineer, Certified Kubernetes Administrator (CKA), or similar.
- SRE Practices: Familiarity with Site Reliability Engineering (SRE) principles, including error budgeting and service level objectives (SLOs).
- Serverless Computing: Knowledge of AWS Lambda, Azure Functions, or similar architectures.
- Containerization: Experience with Docker and Kubernetes, including managing production clusters.
- Security: Awareness of security best practices and implementations.
- Cloud Cost Optimization: Experience with cost-saving initiatives in cloud environments.
- Data Pipelines & ETL: Experience in setting up and managing data pipelines and ETL workflows.
- Familiarity with Modern Tech Stacks: Exposure to Python, Node.js, React.js, and Kotlin for app deployment CI/CD pipelines.
- MLOps Pipelines: Understanding of ML model deployment and operationalization.
- Data Retrieval & Snapshots: Experience with PITR (point-in-time recovery), EC2, and RDS snapshots.
- System Resiliency & Recovery: Strategies for ensuring system reliability and recovery in case of downtime.
At Vahan, you’ll have the opportunity to make a real impact in a sector that touches millions of lives. We’re committed not only to advancing the livelihoods of our workforce but also to taking care of the people who make this mission possible. Here’s what we offer:
- Unlimited PTO: Trust and flexibility to manage your time in the way that works best for you.
- Comprehensive Medical Insurance: We’ve got you covered with plans designed to support you and your loved ones.
- Monthly Wellness Leaves: Regular time off to recharge and focus on what matters most.
- Competitive Pay: Your contributions are recognized and rewarded with a compensation package that reflects your impact.
Join us, and be part of something bigger, where your work drives real, positive change in the world.
Job Title: PostgreSQL Database Administrator
Experience: 6–8 Years
Work Mode: Hybrid
Locations: Hyderabad / Pune
Joiners: Only immediate joiners & candidates who have completed notice period
Required Skills
- Strong hands-on experience in PostgreSQL administration (6+ years).
- Excellent understanding of SQL and query optimization techniques.
- Deep knowledge of database services, architecture, and internals.
- Experience in performance tuning at both DB and OS levels.
- Familiarity with DataGuard or similar high-availability solutions.
- Strong experience in job scheduling and automation.
- Comfortable with installing, configuring, and upgrading PostgreSQL.
- Basic to intermediate knowledge of Linux system administration.
- Hands-on experience with shell scripting for automation and monitoring tasks.
Key Responsibilities
- Administer and maintain PostgreSQL databases with 6+ years of hands-on experience.
- Write and optimize complex SQL queries for performance and scalability.
- Manage database storage structures and ensure optimal disk usage and performance.
- Monitor, analyze, and resolve database performance issues using tools and logs (see the sketch after this list).
- Perform database tuning, configuration adjustments, and query optimization.
- Plan, schedule, and automate jobs using cron or other job scheduling tools at DB and OS levels.
- Install and upgrade PostgreSQL database software to new versions as required.
- Manage high availability and disaster recovery setups, including replication and DataGuard administration (or equivalent techniques).
- Perform regular database backups and restorations to ensure data integrity and availability.
- Apply security patches and updates on time.
- Collaborate with developers for schema design, stored procedures, and access privileges.
- Document configurations, processes, and performance tuning results.
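The posting calls for shell scripting; purely to illustrate the monitoring idea, here is an equivalent sketch in Python with psycopg2 that flags queries running longer than five minutes via pg_stat_activity. Connection settings are placeholders.

```python
# Monitoring sketch (illustrative only): flag long-running queries by reading
# pg_stat_activity. Connection settings are placeholders; a production check
# might instead live in a shell script or monitoring agent.
import psycopg2

DSN = "host=localhost dbname=appdb user=monitor password=secret"  # placeholder

LONG_RUNNING_SQL = """
SELECT pid,
       now() - query_start AS runtime,
       state,
       left(query, 80) AS query
FROM pg_stat_activity
WHERE state <> 'idle'
  AND now() - query_start > interval '5 minutes'
ORDER BY runtime DESC;
"""

def report_long_running_queries() -> None:
    """Print any active queries that have been running for over five minutes."""
    with psycopg2.connect(DSN) as conn:
        with conn.cursor() as cur:
            cur.execute(LONG_RUNNING_SQL)
            for pid, runtime, state, query in cur.fetchall():
                print(f"pid={pid} runtime={runtime} state={state} query={query}")

if __name__ == "__main__":
    report_long_running_queries()
```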
1. Experience in working with various ML libraries and packages like scikit-learn, NumPy, Pandas, TensorFlow, Matplotlib, Caffe, etc.
2. Deep Learning Frameworks: PyTorch, spaCy, Keras
3. Deep Learning Architectures: LSTM, CNN, Self-Attention and Transformers
4. Experience in working with image processing and computer vision is a must
5. Designing data science applications, Large Language Models (LLMs), Generative Pre-trained Transformers (GPT), generative AI techniques, Natural Language Processing (NLP), machine learning techniques, Python, Jupyter Notebook, common data science packages (TensorFlow, scikit-learn, Keras, etc.), LangChain, Flask, FastAPI, prompt engineering
6. Programming experience in Python
7. Strong written and verbal communication skills
8. Excellent interpersonal and collaboration skills
Good-to-Have
1. Experience working with vector databases and graph representations of documents.
2. Experience with building or maintaining MLOps pipelines.
3. Experience in cloud computing infrastructures like AWS SageMaker or Azure ML for implementing ML solutions is preferred.
4. Exposure to Docker and Kubernetes.
Role descriptions / Expectations from the Role:
1. Design and implement scalable and efficient data architectures to support generative AI workflows.
2. Fine-tune and optimize large language models (LLMs) for generative AI; conduct performance evaluation and benchmarking for LLMs and machine learning models.
3. Apply prompt engineering techniques as required by the use case (see the sketch below).
4. Collaborate with research and development teams to build large language models for generative AI use cases; plan and break down larger data science tasks into lower-level tasks.
5. Lead junior data engineers on tasks such as designing data pipelines, dataset creation, and deployment; use data visualization tools, machine learning techniques, natural language processing, feature engineering, deep learning, and statistical modelling as required by the use case.
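A minimal, illustrative sketch of the prompt-engineering step (item 3): applying a prompt template to an open text-generation model via Hugging Face transformers. The model choice and template are arbitrary for the example.

```python
# Minimal illustration (not a production setup) of prompt templating with an
# open text-generation model. Model name and template are arbitrary choices.
from transformers import pipeline

PROMPT_TEMPLATE = (
    "You are a helpful assistant.\n"
    "Summarize the following customer note in one sentence:\n\n{note}\n\nSummary:"
)

def summarize_note(note: str) -> str:
    """Fill the prompt template and generate a short continuation."""
    generator = pipeline("text-generation", model="distilgpt2")
    prompt = PROMPT_TEMPLATE.format(note=note)
    output = generator(prompt, max_new_tokens=40, num_return_sequences=1)
    # The pipeline returns the prompt plus the continuation; keep only the new text.
    return output[0]["generated_text"][len(prompt):].strip()

if __name__ == "__main__":
    print(summarize_note("500 units of SKU-12 delayed by two days due to customs."))
```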
Location: Mumbai (Malad), work from office
6 days working; 1st and 3rd Saturdays off
AWS Expertise: Minimum 2 years of experience working with AWS services like RDS, S3, EC2, and Lambda.
Roles and Responsibilities
1. Backend Development: Develop scalable and high-performance APIs and backend systems using Node.js. Write clean, modular, and reusable code following best practices. Debug, test, and optimize backend services for performance and scalability.
2. Database Management: Design and maintain relational databases using MySQL, PostgreSQL, or AWS RDS. Optimize database queries and ensure data integrity. Implement data backup and recovery plans.
3. AWS Cloud Services: Deploy, manage, and monitor applications using AWS infrastructure. Work with AWS services including RDS, S3, EC2, Lambda, API Gateway, and CloudWatch. Implement security best practices for AWS environments (IAM policies, encryption, etc.).
4. Integration and Microservices: Integrate third-party APIs and services. Develop and manage microservices architecture for modular application development.
5. Version Control and Collaboration: Use Git for code versioning and maintain repositories. Collaborate with front-end developers and project managers for end-to-end project delivery.
6. Troubleshooting and Debugging: Analyze and resolve technical issues and bugs. Provide maintenance and support for existing backend systems.
7. DevOps and CI/CD: Set up and maintain CI/CD pipelines. Automate deployment processes and ensure zero-downtime releases.
8. Agile Development:
Participate in Agile/Scrum ceremonies such as daily stand-ups, sprint planning, and retrospectives.
Deliver tasks within defined timelines while maintaining high quality.
Required Skills
Strong proficiency in Node.js and JavaScript/TypeScript.
Expertise in working with relational databases like MySQL/PostgreSQL and AWS RDS.
Proficient with AWS services including Lambda, S3, EC2, and API Gateway.
Experience with RESTful API design and GraphQL (optional).
Knowledge of containerization using Docker is a plus.
Strong problem-solving and debugging skills.
Familiarity with tools like Git, Jenkins, and Jira.
Job Description : Quantitative R&D Engineer
As a Quantitative R&D Engineer, you’ll explore data and design logic that becomes live trading strategies. You’ll bridge the gap between raw research and deployed, autonomous capital systems.
What You’ll Work On
- Analyze on-chain and market data to identify inefficiencies and behavioral patterns.
- Develop and prototype systematic trading strategies using statistical and ML-based techniques.
- Contribute to signal research, backtesting infrastructure, and strategy evaluation frameworks (see the sketch after this list).
- Monitor and interpret DeFi protocol mechanics (AMMs, perps, lending markets) for alpha generation.
- Collaborate with engineers to turn research into production-grade, automated trading systems.
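A toy illustration of the backtesting work mentioned above: a long-only moving-average crossover evaluated on synthetic prices with pandas/numpy. Real strategies would use actual market or on-chain data and a fuller evaluation framework.

```python
# Toy backtest sketch (illustration only): moving-average crossover strategy
# evaluated on synthetic prices.
import numpy as np
import pandas as pd

def backtest_ma_crossover(prices: pd.Series, fast: int = 10, slow: int = 50) -> float:
    """Return the cumulative return of a long-only MA crossover strategy."""
    fast_ma = prices.rolling(fast).mean()
    slow_ma = prices.rolling(slow).mean()
    # Hold the asset whenever the fast MA is above the slow MA; the signal is
    # lagged one bar to avoid look-ahead bias.
    position = (fast_ma > slow_ma).shift(1).fillna(False).astype(int)
    returns = prices.pct_change().fillna(0.0)
    strategy_returns = position * returns
    return float((1.0 + strategy_returns).prod() - 1.0)

if __name__ == "__main__":
    rng = np.random.default_rng(seed=42)
    synthetic = pd.Series(100 * np.exp(np.cumsum(rng.normal(0.0005, 0.01, 1000))))
    print(f"Cumulative strategy return: {backtest_ma_crossover(synthetic):.2%}")
```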
Ideal Traits
- Strong in data structures, algorithms, and core CS fundamentals.
- Proficiency in any programming language
- Understanding of probability, statistics, or ML concepts.
- Self-driven and comfortable with ambiguity, iteration, and fast learning cycles.
- Strong interest in markets, trading, or algorithmic systems.
Bonus Points For
- Experience with backtesting or feature engineering.
- Exposure to crypto primitives (AMMs, perps, mempools, etc.)
- Projects involving alpha signals, strategy testing, or DeFi bots.
- Participation in quant contests, hackathons, or open-source work.
What You’ll Gain:
- Cutting-Edge Tech Stack: You'll work on modern infrastructure and stay up to date with the latest trends in technology.
- Idea-Driven Culture: We welcome and encourage fresh ideas. Your input is valued, and you're empowered to make an impact from day one.
- Ownership & Autonomy: You’ll have end-to-end ownership of projects. We trust our team and give them the freedom to make meaningful decisions.
- Impact-Focused: Your work won’t be buried under bureaucracy. You’ll see it go live and make a difference in days, not quarters
What We Value:
- Craftsmanship over shortcuts: We appreciate engineers who take the time to understand the problem deeply and build durable solutions—not just quick fixes.
- Depth over haste: If you're the kind of person who enjoys going one level deeper to really "get" how something works, you'll thrive here.
- Invested mindset: We're looking for people who don't just punch tickets, but care about the long-term success of the systems they build.
- Curiosity with follow-through: We admire those who take the time to explore and validate new ideas, not just skim the surface.
Compensation:
- INR 6 - 12 LPA
- Performance Bonuses: Linked to contribution, delivery, and impact.
JOB REQUIREMENT:
Wissen Technology is now hiring an Azure Data Engineer with 7+ years of relevant experience.
We are solving complex technical problems in the financial industry and need talented software engineers to join our mission and be a part of a global software development team. A brilliant opportunity to become a part of a highly motivated and expert team, which has made a mark as a high-end technical consultant.
Required Skills:
· 6+ years as a practitioner in data engineering or a related field.
· Proficiency in Python programming.
· Experience with data processing frameworks like Apache Spark or Hadoop.
· Experience working on Snowflake and Databricks.
· Familiarity with cloud platforms (AWS, Azure) and their data services.
· Experience with data warehousing concepts and technologies.
· Experience with message queues and streaming platforms (e.g., Kafka); see the sketch after this list.
· Excellent communication and collaboration skills.
· Ability to work independently and as part of a geographically distributed team.
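As a small illustration of the Kafka/streaming item above, here is a sketch using the kafka-python client; the topic name and broker address are placeholders, and a real pipeline would land these records in a warehouse such as Snowflake in micro-batches.

```python
# Hedged sketch of a streaming consumer using the kafka-python client.
# Topic, broker, and group names are placeholders; error handling is omitted.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "trade-events",                      # placeholder topic
    bootstrap_servers="localhost:9092",  # placeholder broker
    group_id="data-eng-demo",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

# Consume messages and print a couple of fields; a real pipeline would batch
# these records and write them to a warehouse table instead.
for message in consumer:
    event = message.value
    print(f"offset={message.offset} key={message.key} payload={event}")
```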
We are looking for a passionate and experienced Business Analyst Trainer to join our training team. This role involves delivering high-quality training programs on business analysis tools, methodologies, and best practices, both in-person and online.
Senior Software Engineer – Java
Location: Pune (Hybrid – 3 days from office)
Experience: 8–15 Years
Domain: Information Technology (IT)
Joining: Immediate joiners only
Preference: Local candidates only (Pune-based)
Job Description:
We are looking for experienced and passionate Senior Java Engineers to join a high-performing development team. The role involves building and maintaining robust, scalable, and low-latency backend systems and microservices in a fast-paced, agile environment.
Key Responsibilities:
- Work within a high-velocity scrum team to deliver enterprise-grade software solutions.
- Architect and develop scalable end-to-end web applications and microservices.
- Collaborate with cross-functional teams to analyze requirements and deliver optimal technical solutions.
- Participate in code reviews, unit testing, and deployment.
- Mentor junior engineers while remaining hands-on with development tasks.
- Provide accurate estimates and support the team lead in facilitating development processes.
Mandatory Skills & Experience:
- 6–7+ years of enterprise-level Java development experience.
- Strong in Java 8 or higher (Java 11 preferred), including lambda expressions, the Stream API, and CompletableFuture.
- Minimum 4+ years working with Microservices, Spring Boot, and Hibernate.
- At least 3+ years of experience designing and developing RESTful APIs.
- Kafka – minimum 2 years’ hands-on experience in the current/most recent project.
- Solid experience with SQL.
- AWS – minimum 1.5 years of experience.
- Understanding of CI/CD pipelines and deployment processes.
- Exposure to asynchronous programming, multithreading, and performance tuning.
- Experience working in at least one Fintech domain project (mandatory).
Nice to Have:
- Exposure to Golang or Rust.
- Experience with any of the following tools: MongoDB, Jenkins, Sonar, Oracle DB, Drools, Adobe AEM, Elasticsearch/Solr/Algolia, Spark.
- Strong systems design and data modeling capabilities.
- Experience in payments or asset/wealth management domain.
- Familiarity with rules engines and CMS/search platforms.
Candidate Profile:
- Strong communication and client-facing skills.
- Proactive, self-driven, and collaborative mindset.
- Passionate about clean code and quality deliverables.
- Prior experience in building and deploying multiple products in production.
Note: Only candidates who are based in Pune and can join immediately will be considered.
Job Description :
We are seeking a highly experienced Sr Data Modeler / Solution Architect to join the Data Architecture team at Corporate Office in Bangalore. The ideal candidate will have 4 to 8 years of experience in data modeling and architecture with deep expertise in AWS cloud stack, data warehousing, and enterprise data modeling tools. This individual will be responsible for designing and creating enterprise-grade data models and driving the implementation of Layered Scalable Architecture or Medallion Architecture to support robust, scalable, and high-quality data marts across multiple business units.
This role will involve managing complex datasets from systems like PoS, ERP, CRM, and external sources, while optimizing performance and cost. You will also provide strategic leadership on data modeling standards, governance, and best practices, ensuring the foundation for analytics and reporting is solid and future-ready.
Key Responsibilities:
· Design and deliver conceptual, logical, and physical data models using tools like ERWin.
· Implement Layered Scalable Architecture / Medallion Architecture for building scalable, standardized data marts.
· Optimize performance and cost of AWS-based data infrastructure (Redshift, S3, Glue, Lambda, etc.).
· Collaborate with cross-functional teams (IT, business, analysts) to gather data requirements and ensure model alignment with KPIs and business logic.
· Develop and optimize SQL code, materialized views, stored procedures in AWS Redshift.
· Ensure data governance, lineage, and quality mechanisms are established across systems.
· Lead and mentor technical teams in an Agile project delivery model.
· Manage data layer creation and documentation: data dictionary, ER diagrams, purpose mapping.
· Identify data gaps and availability issues with respect to source systems.
Required Skills & Qualifications:
· Bachelor’s or Master’s degree in Computer Science, IT, or related field (B.E./B.Tech/M.E./M.Tech/MCA).
· Minimum 4 years of experience in data modeling and architecture.
· Proficiency with data modeling tools such as ERWin, with strong knowledge of forward and reverse engineering.
· Deep expertise in SQL (including advanced SQL, stored procedures, performance tuning).
· Strong experience in data warehousing, RDBMS, and ETL tools like AWS Glue, IBM DataStage, or SAP Data Services.
· Hands-on experience with AWS services: Redshift, S3, Glue, RDS, Lambda, Bedrock, and Q.
· Good understanding of reporting tools such as Tableau, Power BI, or AWS QuickSight.
· Exposure to DevOps/CI-CD pipelines, AI/ML, Gen AI, NLP, and polyglot programming is a plus.
· Familiarity with data governance tools (e.g., ORION/EIIG).
· Domain knowledge in Retail, Manufacturing, HR, or Finance preferred.
· Excellent written and verbal communication skills.
Certifications (Preferred):
· AWS Certification (e.g., AWS Certified Solutions Architect or Data Analytics – Specialty)
· Data Governance or Data Modeling Certifications (e.g., CDMP, Databricks, or TOGAF)
Mandatory Skills
AWS, Technical Architecture, AI/ML, SQL, Data Warehousing, Data Modelling
🚀 Hiring: Postgres DBA at Deqode
⭐ Experience: 6+ Years
📍 Location: Pune & Hyderabad
⭐ Work Mode:- Hybrid
⏱️ Notice Period: Immediate Joiners
(Only immediate joiners & candidates serving notice period)
Looking for an experienced Postgres DBA with:-
✅ 6+ years in Postgres & strong SQL skills
✅ Good understanding of database services & storage management
✅ Performance tuning & monitoring expertise
✅ Knowledge of Dataguard admin, backups, upgrades
✅ Basic Linux admin & shell scripting
We are seeking a detail-oriented and analytical Business Analyst to bridge the gap between business needs and technology solutions. The ideal candidate will be responsible for analyzing business processes, identifying improvement areas, and supporting data-driven decision-making through insights and documentation.
About Us
DAITA is a German AI startup revolutionizing the global textile supply chain by digitizing factory-to-brand workflows. We are building cutting-edge AI-powered SaaS and Agentic Systems that automate order management, production tracking, and compliance — making the supply chain smarter, faster, and more transparent.
Fresh off a $500K pre-seed raise, our passionate team is on the ground in India, collaborating directly with factories and brands to build our MVP and create real-world impact. If you’re excited by the intersection of AI, SaaS, and supply chain innovation, join us to help reshape how textiles move from factory floors to global brands.
Role Overview
We’re seeking a versatile Full-Stack Engineer to join our growing engineering team. You’ll be instrumental in designing and building scalable, secure, and high-performance applications that power our AI-driven platform. Working closely with Founders, ML Engineers, and Pilot Customers, you’ll transform complex AI workflows into intuitive, production-ready features.
What You’ll Do
• Design, develop, and deploy backend services, APIs, and microservices powering our platform (see the sketch after this list).
• Build responsive, user-friendly frontend applications tailored for factory and brand users.
• Integrate AI/ML models and agentic workflows into seamless production environments.
• Develop features supporting order parsing, supply chain tracking, compliance, and reporting.
• Collaborate cross-functionally to iterate rapidly, test with users, and deliver impactful releases.
• Optimize applications for performance, scalability, and cost-efficiency on cloud platforms.
• Establish and improve CI/CD pipelines, deployment processes, and engineering best practices.
• Write clear documentation and maintain clean, maintainable code.
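A minimal sketch (not DAITA's actual service) of the kind of backend API described above, using FastAPI with an in-memory store; the routes and order model are illustrative assumptions.

```python
# Minimal sketch of an order-tracking API with FastAPI. The routes, model, and
# in-memory store are illustrative assumptions, not a real service.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="order-tracking-sketch")

class Order(BaseModel):
    order_id: str
    brand: str
    status: str  # e.g. "received", "in_production", "shipped"

# In-memory stand-in for a real database
ORDERS: dict[str, Order] = {}

@app.post("/orders", response_model=Order)
def create_order(order: Order) -> Order:
    """Register a new order so its production status can be tracked."""
    ORDERS[order.order_id] = order
    return order

@app.get("/orders/{order_id}", response_model=Order)
def get_order(order_id: str) -> Order:
    """Return the current status of an order, or 404 if unknown."""
    if order_id not in ORDERS:
        raise HTTPException(status_code=404, detail="order not found")
    return ORDERS[order_id]

# Run locally with: uvicorn main:app --reload  (assuming this file is main.py)
```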
Required Skills
• 3–5 years of professional Full-Stack development experience
• Strong backend skills with frameworks like Node.js, Python (FastAPI, Django), Go, or similar
• Frontend experience with React, Vue.js, Next.js, or similar modern frameworks
• Solid knowledge and experience with relational databases (PostgreSQL, MySQL) and NoSQL databases (MongoDB, Redis, Neon)
• Strong API design skills (REST mandatory; GraphQL a plus)
• Containerization expertise with Docker
• Container orchestration and management with Kubernetes (including experience with Helm charts, operators, or custom resource definitions)
• Cloud deployment and infrastructure experience on AWS, GCP or Azure
• Hands-on experience deploying AI/ML models in cloud-native environments (AWS, GCP or Azure) with scalable infrastructure and monitoring.
• Experience with managed AI/ML services like AWS SageMaker, GCP Vertex AI, Azure ML, Together.ai, or similar
• Experience with CI/CD pipelines and DevOps tools such as Jenkins, GitHub Actions, Terraform, Ansible, or ArgoCD
• Familiarity with monitoring, logging, and observability tools like Prometheus, Grafana, ELK stack (Elasticsearch, Logstash, Kibana), or Helicone
Nice-to-have
• Experience with TypeScript for full-stack AI SaaS development
• Use of modern UI frameworks and tooling like Tailwind CSS
• Familiarity with modern AI-first SaaS concepts viz. vector databases for fast ML data retrieval, prompt engineering for LLM integration, integrating with OpenRouter or similar LLM orchestration frameworks etc.
• Knowledge of MLOps tools like Kubeflow, MLflow, or Seldon for model lifecycle management.
• Background in building data pipelines, real-time analytics, and predictive modeling.
• Knowledge of AI-driven security tools and best practices for SaaS compliance.
• Proficiency in cloud automation, cost optimization, and DevOps for AI workflows.
• Ability to design and implement hyper-personalized, adaptive user experiences.
What We Value
• Ownership: You take full responsibility for your work and ship high-quality solutions quickly.
• Bias for Action: You’re pragmatic, proactive, and focused on delivering results.
• Clear Communication: You articulate ideas, challenges, and solutions effectively across teams.
• Collaborative Spirit: You thrive in a cross-functional, distributed team environment.
• Customer Focus: You build with empathy for end users and real-world usability.
• Curiosity & Adaptability: You embrace learning, experimentation, and pivoting when needed.
• Quality Mindset: You write clean, maintainable, and well-tested code.
Why Join DAITA?
• Be part of a mission-driven startup transforming a $1+ Trillion global industry.
• Work closely with founders and AI experts on cutting-edge technology.
• Directly impact real-world supply chains and sustainability.
• Grow your skills in AI, SaaS, and supply chain tech in a fast-paced environment.
Employment type: Contract basis
Key Responsibilities
- Design, develop, and maintain scalable data pipelines using PySpark and distributed computing frameworks (see the sketch after this list).
- Implement ETL processes and integrate data from structured and unstructured sources into cloud data warehouses.
- Work across Azure or AWS cloud ecosystems to deploy and manage big data workflows.
- Optimize performance of SQL queries and develop stored procedures for data transformation and analytics.
- Collaborate with Data Scientists, Analysts, and Business teams to ensure reliable data availability and quality.
- Maintain documentation and implement best practices for data architecture, governance, and security.
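A minimal sketch of the PySpark pipeline work described above, with placeholder paths and columns: read raw CSV, clean and aggregate, and write partitioned Parquet.

```python
# Minimal PySpark ETL sketch; paths and column names are placeholders.
from pyspark.sql import SparkSession, functions as F

def run_pipeline(input_path: str = "s3a://raw/orders/*.csv",
                 output_path: str = "s3a://curated/daily_sales") -> None:
    spark = SparkSession.builder.appName("daily-sales-etl").getOrCreate()

    orders = (
        spark.read.option("header", True).option("inferSchema", True).csv(input_path)
        .filter(F.col("amount") > 0)                      # drop bad rows
        .withColumn("order_date", F.to_date("order_ts"))  # derive partition key
    )

    daily_sales = (
        orders.groupBy("order_date", "region")
              .agg(F.sum("amount").alias("revenue"),
                   F.count("*").alias("orders"))
    )

    daily_sales.write.mode("overwrite").partitionBy("order_date").parquet(output_path)
    spark.stop()

if __name__ == "__main__":
    run_pipeline()
```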
⚙️ Required Skills
- Programming: Proficient in PySpark, Python, and SQL.
- Cloud Platforms: Hands-on experience with Azure Data Factory, Databricks, or AWS Glue/Redshift.
- Data Engineering Tools: Familiarity with Apache Spark, Kafka, Airflow, or similar tools.
- Data Warehousing: Strong knowledge of designing and working with data warehouses like Snowflake, BigQuery, Synapse, or Redshift.
- Data Modeling: Experience in dimensional modeling, star/snowflake schema, and data lake architecture.
- CI/CD & Version Control: Exposure to Git, Terraform, or other DevOps tools is a plus.
🧰 Preferred Qualifications
- Bachelor's or Master's in Computer Science, Engineering, or related field.
- Certifications in Azure/AWS are highly desirable.
- Knowledge of business intelligence tools (Power BI, Tableau) is a bonus.
· 3 to 5 years of full-stack development experience implementing applications using Python and React.js
· In-depth knowledge of Python – data analytics, NLP, and Flask APIs (see the sketch after this list)
· Experience working with SQL databases (MySQL/Postgres – min. 2 years)
· Ability to use Gen AI tools for productivity
· Gen AI for natural language processing use cases – using ChatGPT-4/Gemini Flash or other cutting-edge tools
· Hands-on exposure to working with messaging systems like RabbitMQ
· Experience with end-to-end and unit testing frameworks (Jest/Cypress)
· Experience working with NoSQL databases like MongoDB
· Understanding of differences between multiple delivery platforms, such as mobile vs. desktop, and optimizing output to match the specific platform
· Cloud architectural knowledge of Azure cloud
· Proficient understanding of code versioning tools, such as Git and SVN
· Knowledge of CI/CD (Jenkins/Hudson)
· Self-organizing and experience working in an Agile/Scrum culture
Good to have:
· Experience working with Angular, Elasticsearch, and Redis
· Understanding of accessibility and security compliances
· Understanding of UI/UX
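A minimal sketch of the Python/Flask-plus-SQL stack referenced above, using SQLite as a stand-in for MySQL/Postgres; the route and schema are hypothetical.

```python
# Minimal Flask + SQL sketch; the route, table, and columns are hypothetical,
# and SQLite stands in for MySQL/Postgres.
import sqlite3
from flask import Flask, jsonify, request

app = Flask(__name__)
DB_PATH = "app.db"  # placeholder; MySQL/Postgres in a real deployment

def get_db():
    conn = sqlite3.connect(DB_PATH)
    conn.row_factory = sqlite3.Row
    return conn

@app.route("/api/customers", methods=["GET"])
def list_customers():
    """Return customers, optionally filtered by city via ?city=..."""
    city = request.args.get("city")
    conn = get_db()
    try:
        if city:
            rows = conn.execute(
                "SELECT id, name, city FROM customers WHERE city = ?", (city,)
            ).fetchall()
        else:
            rows = conn.execute("SELECT id, name, city FROM customers").fetchall()
        return jsonify([dict(row) for row in rows])
    finally:
        conn.close()

if __name__ == "__main__":
    app.run(debug=True)
```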
Looking for a passionate developer and team player who wants to learn, contribute, and bring fun and energy to the team. We are a friendly startup where we provide opportunities to explore and learn a lot of things (new technologies, tools, etc.) in building quality products using best-in-class technology.
Responsibilities
● Design and develop new features using full-stack development (Java/Spring/React/Angular/MySQL) for a cloud (AWS/other) and mobile product application in an SOA/microservices architecture.
● Design awesome features and continuously improve them by exploring alternative technologies and design approaches.
● Performance testing with Gatling (Scala).
● Work with CI/CD pipeline and tools (Docker, Ansible) to improve build and
deployment process.
● Work with QA to ensure the quality and timeliness of new release deployments.
Skills/Experience
Good coding and problem-solving skills and an interest in learning new things are key. Time and training will be provided to learn new technologies/tools.
● 4 or more years of professional experience in building web/mobile applications using
Java or similar technologies (C#, Ruby, Python, Elixir, NodeJS).
● Experience in Spring Framework or similar frameworks.
● Experience with any database (SQL/NoSQL).
● Any experience in front-end development using React/Vue/Angular/similar frameworks.
● Any experience with Java or similar testing frameworks (JUnit, mocks, etc.).
Greetings from Wissen Technology!
We are hiring Java Developers for Mumbai location.
Experience: 4-8 years
Location: Mumbai (Goregaon) - Hybrid
Notice Period: immediate or serving notice period only
Interview Process: initial rounds virtual + final round F2F
Software Engineer/Senior Software Engineer/Lead Engineer-Java
· Experience in Core Java 5.0 and above, Data Structures, OOPS, Multithreading, Algorithms, Collections, Unix/Linux
· Possess good architectural knowledge and be aware of enterprise application design patterns.
· Should have the ability to analyse, design, develop and test complex, low-latency client-facing applications.
· Good development experience with RDBMS
· Good knowledge of multi-threading and high-volume server-side development
· Basic working knowledge of Unix/Linux
· Excellent problem solving and coding skills in Java
· Strong interpersonal, communication and analytical skills.
· Should have the ability to express their design ideas and thoughts.
Job Brief-
· Understand product requirements and come up with solution approaches
· Build and enhance large scale domain centric applications
· Deploy high quality deliverables into production adhering to the security, compliance and SDLC guidelines
Job Summary:
We are looking for an experienced Full Stack Developer with expertise in Angular 15+, PHP, Node.js and SQL databases. As a key member of our engineering team, you will design, develop, and maintain both the front-end and back-end of our applications. If you are passionate about building scalable, high-performance web solutions and have experience with cloud technologies, we encourage you to apply.
Key Responsibilities:
- Front-End Development:
- Develop responsive, high-performance web applications using Angular 15+.
- Ensure a seamless and engaging user experience by collaborating closely with UX/UI designers.
- Implement modern web technologies and best practices for building dynamic, scalable applications.
- Back-End Development:
- Build and maintain PHP-based server-side applications, ensuring reliability, security, and scalability.
- Work on backend systems using Node.js and PHP, maintaining seamless integration, performance, and security across multiple services.
- Develop and integrate RESTful APIs to support front-end functionality.
- Design and optimize database schemas and queries for SQL databases (e.g., MySQL, PostgreSQL).
- Cloud and Infrastructure Integration:
- Integrate and manage cloud services, including AWS Lambda, AWS SQS, Firebase, and Google Cloud Tasks.
- Work with the team to ensure efficient cloud-based deployments and architecture optimization.
- Collaboration and Code Quality:
- Collaborate with cross-functional teams to define and implement software requirements.
- Ensure code quality and maintainability by conducting code reviews and following industry best practices.
- Write unit and integration tests to ensure software reliability and robustness.
- Continuous Improvement:
- Stay up to date with emerging technologies and trends in web development and cloud services.
- Identify and resolve performance bottlenecks and improve application performance.
Required Skills and Qualifications:
- 4-5 years of professional experience in full-stack web development.
- Proficiency in Angular 15+, PHP, Node.js and SQL databases (MySQL, PostgreSQL, etc.).
- Strong understanding of web application architecture, APIs, and cloud integration.
- Experience with version control tools like Git.
- Solid understanding of front-end build tools and optimization techniques.
Preferred Skills:
- Experience with Joomla 3+ and its framework.
- Familiarity with cloud platforms such as AWS Lambda, Firebase, and Google Cloud Tasks.
- Knowledge of other cloud services and serverless architectures.
- Experience with Cypress for end-to-end testing and test automation.
Education:
- Bachelor’s degree in Computer Science, Engineering, or related field, or equivalent work experience.
Job Title: Data Science Intern
Location: 6th Sector HSR Layout, Bangalore - Work from Office 5.5 Days
Duration: 3 Months | Stipend: Up to ₹12,000 per month
Post-Internship Offer (PPO): Available based on performance
🧑💻 About the Role
We are looking for a passionate and proactive Data Science Assistant Intern who is equally excited about mentoring learners and gaining hands-on experience with real-world data operations.
This is a 50% technical + 50% mentorship role that blends classroom support with practical data work. Ideal for those looking to build a career in EdTech and Applied Data Science.
🚀 What You'll Do
🚀 Technical Responsibilities (50%)
- Create and manage dashboards using Python or BI tools like Power BI/Tableau
- Write and optimize SQL queries to extract and analyze backend data
- Support in data gathering, cleaning, and basic analysis
- Contribute to building data pipelines to assist internal decision-making and analytics
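A small, self-contained sketch of this technical half of the role: extracting data with SQL, cleaning it with Pandas, and producing a dashboard-ready summary. All data here is synthetic and the table/column names are made up.

```python
# Illustrative sketch: SQL extraction + Pandas cleanup + a dashboard-ready summary.
# All data is synthetic and created in memory; table/column names are hypothetical.
import sqlite3
import pandas as pd

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE enrollments (student TEXT, course TEXT, score REAL);
    INSERT INTO enrollments VALUES
        ('A', 'Python', 82), ('B', 'Python', NULL),
        ('C', 'Statistics', 74), ('D', 'Statistics', 91);
""")

# Extract with SQL, then clean with Pandas.
df = pd.read_sql_query("SELECT * FROM enrollments", conn)
df["score"] = df["score"].fillna(df["score"].median())

# Dashboard-ready aggregate (could be plotted with Matplotlib or pushed to Power BI/Tableau).
summary = df.groupby("course")["score"].agg(["count", "mean"]).round(1)
print(summary)
```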
🚀 Mentorship & Support (50%)
- Assist instructors during live Data Science sessions
- Solve doubts related to Python, Machine Learning, and Statistics
- Create and review quizzes, assignments, and other content
- Provide one-on-one academic support and mentoring
- Foster a positive and interactive learning environment
✅ Requirements
- Bachelor’s degree in Data Science, Computer Science, Statistics, or a related field
- Strong knowledge of:
- Python (Data Structures, Functions, OOP, Debugging)
- Pandas, NumPy, Matplotlib
- Machine Learning algorithms (scikit-learn)
- SQL and basic data wrangling
- APIs, Web Scraping, and Time-Series basics
- Advanced Excel: lookup & reference (VLOOKUP, INDEX+MATCH, XLOOKUP), logical functions (IF, AND, OR), statistical & aggregate functions (COUNTIFS, STDEV, PERCENTILE), text cleanup (TRIM, SUBSTITUTE), time functions (DATEDIF, NETWORKDAYS), Pivot Tables, Power Query, conditional formatting, data validation, What-If Analysis, and dynamic dashboards using charts & slicers.
- Excellent communication and interpersonal skills
- Prior mentoring, teaching, or tutoring experience is a big plus
- Passion for helping others learn and grow
1. Software Development Engineer - Salesforce
What we ask for
We are looking for strong engineers to build best-in-class systems for commercial & wholesale banking at Bank, using Salesforce Service Cloud. We seek experienced developers who bring a deep understanding of Salesforce development practices, patterns, anti-patterns, governor limits, and the sharing & security model, which will allow us to architect and develop robust applications.
You will work closely with business and product teams to build applications that give end users an intuitive, clean, minimalist, easy-to-navigate experience.
Develop systems by applying software development principles and clean-code practices so that they are scalable, secure, highly resilient, and low-latency.
You should be open to working in a start-up environment and confident in dealing with complex issues, keeping focus on solutions and project objectives as your guiding North Star.
Technical Skills:
● Strong hands-on frontend development using JavaScript and LWC
● Expertise in backend development using Apex, Flows, Async Apex
● Understanding of Database concepts: SOQL, SOSL and SQL
● Hands-on experience in API integration using SOAP, REST APIs, and GraphQL
● Experience with ETL tools, data migration, and data governance
● Experience with Apex design patterns, integration patterns, and the Apex testing framework
● Follow an agile, iterative execution model using CI/CD tools like Azure DevOps, GitLab, and Bitbucket
● Should have worked with at least one programming language (Java, Python, C++) and have a good understanding of data structures
Preferred qualifications
● Graduate degree in engineering
● Experience developing with India stack
● Experience in fintech or banking domain
Brandzzy is a forward-thinking technology company dedicated to building innovative and scalable Software-as-a-Service (SaaS) solutions. We are a passionate team focused on creating products that solve real-world problems and deliver exceptional user experiences. Join us as we scale our platform to new heights.
Role Summary:
We are seeking an experienced and visionary Senior Full Stack Developer to lead the technical design and development of our core SaaS platform. In this role, you will be responsible for making critical architectural decisions, mentoring other engineers, and ensuring our application is built for massive scale and high performance. You are not just a coder; you are a technical leader who will shape the future of our product and drive our engineering culture forward.
Key Responsibilities:
- Lead the architecture and design of highly scalable, secure, and resilient full-stack web applications.
- Take ownership of major features and system components, from technical strategy through to deployment and long-term maintenance.
- Mentor and guide junior and mid-level developers, conducting code reviews and fostering a culture of technical excellence.
- Drive technical strategy and make key decisions on technology stacks, frameworks, and infrastructure.
- Engineer and implement solutions specifically for SaaS scalability, including microservices, containerization (Docker, Kubernetes), and efficient cloud resource management.
- Establish and champion best practices for code quality, automated testing, and robust CI/CD pipelines.
- Collaborate with product leadership to translate business requirements into concrete technical roadmaps.
Skills & Qualifications:
- 5+ years of professional experience in full-stack development, with a proven track record of building and launching complex SaaS products.
- Deep expertise in both front-end (React, Angular, Vue.js) and back-end (Node.js, Python, Java, Go) technologies.
- Expert-level knowledge of designing and scaling applications on a major cloud platform (AWS, Azure, or GCP).
- Proven, hands-on experience architecting for scale, including deep knowledge of microservices architecture, message queues, and database scaling strategies (e.g., sharding, replication).
- In-depth understanding of database technologies (both SQL and NoSQL) and how to choose the right one for the job.
- Expertise in implementing and managing CI/CD pipelines and advocating for DevOps principles.
- Strong leadership and communication skills, with the ability to articulate complex technical ideas to both technical and non-technical stakeholders.
- A passion for solving complex problems and a proactive, self-starter attitude.
Job Description :
We are seeking a talented and experienced Full Stack Developer to join our dynamic team in Hyderabad. The ideal candidate will have a passion for building scalable and efficient web applications, a strong understanding of modern frameworks and technologies, and a keen eye for user experience and design.
Key Responsibilities :
- Design, develop, and maintain web-based applications using React JS, NodeJS, Angular, React Native, and other modern frameworks.
- Develop hybrid mobile applications and responsive web interfaces using Bootstrap and JavaScript.
- Build and optimize back-end services with frameworks such as Express.js or Restify.
- Work with SQL databases, including schema design and query optimization.
- Utilize ORM tools like Sequelize for database management.
- Implement real-time communication features and ensure browser compatibility.
- Collaborate with cross-functional teams to participate in the product development lifecycle, including prototyping, testing, and deployment.
- Adapt to and learn alternative technologies based on project requirements.
Required Skills & Experience :
- 3+ years of experience in full-stack web development.
- Proficient in Angular, NodeJS, React.JS, and JavaScript.
- Strong experience with Express.js or Restify frameworks.
- Solid understanding of SQL databases and ORM tools like Sequelize.
- Knowledge of responsive design principles and hands-on experience in developing responsive web applications.
- Familiarity with React Native for mobile development (a plus)
- Strong understanding of real-time communication technologies.
Additional Skills & Experience :
- Experience with NoSQL databases such as MongoDB or Cassandra.
- Awareness of internationalization (i18n) and the latest trends in UI/UX design.
- Familiarity with other JavaScript libraries/frameworks like VueJS.
- Hands-on experience with implementing payment gateways for different regions.
- Excellent facilitation, verbal, and written communication skills.
- Eagerness to contribute to functional and user experience design discussions.
Education: B.Tech/M.Tech in CSE/IT/ECE
• Software engineering experience, including hands-on application development using Java and distributed technologies, both on-premises and in the cloud.
• Strong in Java/JEE, Spring Framework, MS, JavaScript, and RESTful web services
• Strong understanding of microservices and associated design patterns
• Experience with the latest unit testing tools, including JUnit
• Working knowledge of SQL
• Experience in microservices
• Experience in identifying and remediating security vulnerabilities
• Should be well versed in test-driven development and knowledgeable about associated tools and practices (CI/CD)
AccioJob is conducting a Walk-In Hiring Drive with a Fintech Software Product Company at AccioJob Skill Centre – Noida for the position of Software Programmer Trainee.
To Apply, Register, and select your Slot here:
https://go.acciojob.com/FKNt9k
Required Skills: DSA, Java, SQL, Spring
Eligibility:
- Degree: B.Tech
- Branch: CS / IT
- Graduation Year: 2023, 2024 & 2025
Work Details:
- Work Mode: Work From Office
- Work Location: New Delhi
- CTC: ₹3.36 LPA
- Service Agreement: 2 years and 6 months (Original educational documents will be retained during this period)
Evaluation Process:
- Round 1: Offline Assessment at AccioJob Skill Centre – Noida
- Further Rounds (for shortlisted candidates only)
- Online Assessment – MCQ
- Online Assessment – Coding Test
- Technical Interview (Virtual / In-person)
Important Note: Please bring your laptop and earphones for the test.
Register here: https://go.acciojob.com/FKNt9k
Job Overview
We are looking for a detail-oriented and skilled QA Engineer with expertise in Cypress to join our Quality Assurance team. In this role, you will be responsible for creating and maintaining automated test scripts to ensure the stability and performance of our web applications. You’ll work closely with developers, product managers, and other QA professionals to identify issues early and help deliver a high-quality user experience.
You should have a strong background in test automation, excellent analytical skills, and a passion for improving software quality through efficient testing practices.
Key Responsibilities
- Develop, maintain, and execute automated test cases using Cypress.
- Design robust test strategies and plans based on product requirements and user stories.
- Work with cross-functional teams to identify test requirements and ensure proper coverage.
- Perform regression, integration, smoke, and exploratory testing as needed.
- Report and track defects, and work with developers to resolve issues quickly.
- Collaborate in Agile/Scrum development cycles and contribute to sprint planning and reviews.
- Continuously improve testing tools, processes, and best practices.
- Optimize test scripts for performance, reliability, and maintainability.
Required Skills & Qualifications
- Hands-on experience with Cypress and JavaScript-based test automation.
- Strong understanding of QA methodologies, tools, and processes.
- Experience in testing web applications across multiple browsers and devices.
- Familiarity with REST APIs and tools like Postman or Swagger.
- Experience with version control systems like Git.
- Knowledge of CI/CD pipelines and integrating automated tests (e.g., GitHub Actions, Jenkins).
- Excellent analytical and problem-solving skills.
- Strong written and verbal communication.
Preferred Qualifications
- Experience with other automation tools (e.g., Selenium, Playwright) is a plus.
- Familiarity with performance testing or security testing.
- Background in Agile or Scrum methodologies.
- Basic understanding of DevOps practices.
Hybrid work mode
(Azure) EDW: Experience loading star-schema data warehouses using framework architectures, including loading Type 2 dimensions (sketched below), and ingesting data from various sources (structured and semi-structured), with hands-on experience ingesting via APIs into lakehouse architectures.
Key Skills: Azure Databricks, Azure Data Factory, Azure Data Lake Gen2 Storage, SQL (expert), Python (intermediate), Azure cloud services knowledge, data analysis (SQL), data warehousing, documentation – BRD, FRD, user story creation.
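For reference, loading a Type 2 dimension means expiring the current dimension row when a tracked attribute changes and inserting a new current row. Below is a compact, hypothetical Pandas sketch of that merge logic; in Databricks this would more typically be a Delta Lake MERGE, and the column names and dates here are invented.

```python
# Hypothetical SCD Type 2 sketch in Pandas: expire changed rows, append new current versions.
import pandas as pd

dim = pd.DataFrame({
    "customer_id": [1, 2],
    "city": ["Pune", "Mumbai"],
    "valid_from": pd.to_datetime(["2024-01-01", "2024-01-01"]),
    "valid_to": pd.NaT,
    "is_current": [True, True],
})
incoming = pd.DataFrame({"customer_id": [1, 3], "city": ["Hyderabad", "Chennai"]})
load_date = pd.Timestamp("2025-01-01")

current = dim[dim["is_current"]].merge(incoming, on="customer_id", how="right",
                                       suffixes=("_old", ""))
changed = current[current["city_old"].notna() & (current["city_old"] != current["city"])]
new_keys = current[current["city_old"].isna()]

# 1) Expire current rows whose tracked attribute changed.
mask = dim["is_current"] & dim["customer_id"].isin(changed["customer_id"])
dim.loc[mask, ["valid_to", "is_current"]] = [load_date, False]

# 2) Append new current versions for changed and brand-new keys.
new_rows = pd.concat([changed, new_keys])[["customer_id", "city"]].assign(
    valid_from=load_date, valid_to=pd.NaT, is_current=True)
dim = pd.concat([dim, new_rows], ignore_index=True)
print(dim)
```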
Immediate Hiring for Business Analyst
Position: Business Analyst
Experience: 5 - 8 Years
Location: Hyderabad
Job Summary:
We are seeking a motivated and detail-oriented Business Analyst with 5 years of experience in the Travel domain. The ideal candidate will have a strong understanding of the travel industry, including airlines, travel agencies, and online booking systems. You will work closely with cross-functional teams to gather business requirements, analyze processes, and deliver solutions that improve customer experience and operational efficiency.
Key Responsibilities:
- Requirement Gathering & Analysis: Collaborate with stakeholders to gather, document, and analyze business requirements, ensuring alignment with business goals.
- Process Improvement: Identify opportunities for process improvement and optimization in travel booking, ticketing, and customer support systems.
- Stakeholder Communication: Act as the bridge between the business stakeholders and technical teams, ensuring clear communication of requirements, timelines, and deliverables.
- Solution Design: Participate in the design and development of solutions, collaborating with IT and development teams to ensure business needs are met.
- Data Analysis: Analyze data related to customer journeys, bookings, and cancellations to identify trends and insights for decision-making.
- Documentation: Prepare detailed documentation including business requirements documents (BRD), user stories, process flows, and functional specifications.
- Testing & Validation: Support testing teams during User Acceptance Testing (UAT) to ensure solutions meet business needs, and facilitate issue resolution.
- Market Research: Stay up to date with travel industry trends, customer preferences, and competitor offerings to ensure innovative solutions are delivered.
Qualifications & Skills:
- Education: Bachelor’s degree in Business Administration, Information Technology, or a related field.
- Experience:
- 5 years of experience as a Business Analyst in the travel industry.
- Hands-on experience in working with travel booking systems (GDS, OTA) is highly preferred.
- Domain Knowledge:
- Strong understanding of the travel industry, including booking engines, reservations, ticketing, cancellations, and customer support.
- Familiarity with industry-specific regulations and best practices.
- Analytical Skills: Excellent problem-solving skills with the ability to analyze complex data and business processes.
- Technical Skills:
- Proficiency in Microsoft Office (Word, Excel, PowerPoint).
- Knowledge of SQL or data visualization tools (Power BI, Tableau) is a plus.
- Communication: Strong verbal and written communication skills with the ability to convey complex information clearly.
- Attention to Detail: Strong focus on accuracy and quality of work, ensuring that solutions meet business requirements.
Preferred:
- Prior experience with Agile methodologies.
- Certification in Business Analysis (CBAP or similar).
Data Scientist
Job Id: QX003
About Us:
QX impact was launched with a mission to make AI accessible and affordable and to deliver AI products/solutions at scale for enterprises by bringing the power of data, AI, and engineering to drive digital transformation. We believe that without insights, businesses will continue to struggle to understand their customers and may even lose them; without insights, businesses won't be able to deliver differentiated products and services; and without insights, businesses can't achieve the new level of "Operational Excellence" that is crucial to remaining competitive, meeting rising customer expectations, expanding into new markets, and digitalizing.
Position Overview:
We are seeking a collaborative and analytical Data Scientist who can bridge the gap between business needs and data science capabilities. In this role, you will lead and support projects that apply machine learning, AI, and statistical modeling to generate actionable insights and drive business value.
Key Responsibilities:
- Collaborate with stakeholders to define and translate business challenges into data science solutions.
- Conduct in-depth data analysis on structured and unstructured datasets.
- Build, validate, and deploy machine learning models to solve real-world problems.
- Develop clear visualizations and presentations to communicate insights.
- Drive end-to-end project delivery, from exploration to production.
- Contribute to team knowledge sharing and mentorship activities.
Must-Have Skills:
- 3+ years of progressive experience in data science, applied analytics, or a related quantitative role, demonstrating a proven track record of delivering impactful data-driven solutions.
- Exceptional programming proficiency in Python, including extensive experience with core libraries such as Pandas, NumPy, Scikit-learn, NLTK and XGBoost.
- Expert-level SQL skills for complex data extraction, transformation, and analysis from various relational databases.
- Deep understanding and practical application of statistical modeling and machine learning techniques, including but not limited to regression, classification, clustering, time series analysis, and dimensionality reduction.
- Proven expertise in end-to-end machine learning model development lifecycle, including robust feature engineering, rigorous model validation and evaluation (e.g., A/B testing), and model deployment strategies.
- Demonstrated ability to translate complex business problems into actionable analytical frameworks and data science solutions, driving measurable business outcomes.
- Proficiency in advanced data analysis techniques, including Exploratory Data Analysis (EDA), customer segmentation (e.g., RFM analysis), and cohort analysis, to uncover actionable insights.
- Experience in designing and implementing data models, including logical and physical data modeling, and developing source-to-target mappings for robust data pipelines.
- Exceptional communication skills, with the ability to clearly articulate complex technical findings, methodologies, and recommendations to diverse business stakeholders (both technical and non-technical audiences).
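As a simple illustration of the model development and validation lifecycle listed above, here is a minimal scikit-learn sketch on synthetic data; the estimator, metric, and split sizes are placeholder choices, not a prescribed approach.

```python
# Illustrative model-development sketch: feature scaling + model + cross-validated evaluation.
# Data is synthetic; the estimator and metric are placeholders, not a prescribed approach.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=1_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1_000))

# Rigorous validation before any deployment decision: k-fold CV on the training split...
cv_scores = cross_val_score(model, X_train, y_train, cv=5, scoring="roc_auc")
print("CV ROC-AUC:", round(cv_scores.mean(), 3))

# ...then a final check on the held-out test split.
model.fit(X_train, y_train)
print("Test accuracy:", round(model.score(X_test, y_test), 3))
```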
Good-to-Have Skills:
- Experience with cloud platforms (Azure, AWS, GCP) and specific services like Azure ML, Synapse, Azure Kubernetes and Databricks.
- Familiarity with big data processing tools like Apache Spark or Hadoop.
- Exposure to MLOps tools and practices (e.g., MLflow, Docker, Kubeflow) for model lifecycle management.
- Knowledge of deep learning libraries (TensorFlow, PyTorch) or experience with Generative AI (GenAI) and Large Language Models (LLMs).
- Proficiency with business intelligence and data visualization tools such as Tableau, Power BI, or Plotly.
- Experience working within Agile project delivery methodologies.
Competencies:
· Tech Savvy - Anticipating and adopting innovations in business-building digital and technology applications.
· Self-Development - Actively seeking new ways to grow and be challenged using both formal and informal development channels.
· Action Oriented - Taking on new opportunities and tough challenges with a sense of urgency, high energy, and enthusiasm.
· Customer Focus - Building strong customer relationships and delivering customer-centric solutions.
· Optimizes Work Processes - Knowing the most effective and efficient processes to get things done, with a focus on continuous improvement.
Why Join Us?
- Be part of a collaborative and agile team driving cutting-edge AI and data engineering solutions.
- Work on impactful projects that make a difference across industries.
- Opportunities for professional growth and continuous learning.
- Competitive salary and benefits package.

Role : AIML Engineer
Location : Madurai
Experience : 5 to 10 Yrs
Mandatory Skills : AIML, Python, SQL, ML Models, PyTorch, Pandas, Docker, AWS
Language: Python
DBs: SQL
Core Libraries:
- Time Series & Forecasting: pmdarima, statsmodels, Prophet, GluonTS, NeuralProphet
- SOTA ML: ML models, boosting & ensemble models, etc.
- Explainability: SHAP / LIME
Required skills:
- Deep Learning: PyTorch, PyTorch Forecasting,
- Data Processing: Pandas, NumPy, Polars (optional), PySpark
- Hyperparameter Tuning: Optuna, Amazon SageMaker Automatic Model Tuning
- Deployment & MLOps: Batch & Realtime with API endpoints, MLFlow
- Serving: TorchServe, Sagemaker endpoints / batch
- Containerization: Docker
- Orchestration & Pipelines: AWS Step Functions, AWS SageMaker Pipelines
AWS Services:
- SageMaker (Training, Inference, Tuning)
- S3 (Data Storage)
- CloudWatch (Monitoring)
- Lambda (Trigger-based Inference)
- ECR, ECS or Fargate (Container Hosting)
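For illustration only, a tiny forecasting sketch using pmdarima (one of the libraries listed above) on a synthetic monthly series; the seasonality settings and horizon are arbitrary placeholders.

```python
# Illustrative forecasting sketch using pmdarima's auto_arima on synthetic monthly data.
# The series, seasonality settings, and horizon are made up for demonstration.
import numpy as np
import pandas as pd
import pmdarima as pm

rng = pd.date_range("2020-01-01", periods=48, freq="MS")
y = pd.Series(100 + np.arange(48) * 2 + 10 * np.sin(np.arange(48) * 2 * np.pi / 12), index=rng)

# Fit a seasonal ARIMA; stepwise search keeps the example fast.
model = pm.auto_arima(y, seasonal=True, m=12, stepwise=True, suppress_warnings=True)

# Forecast the next 6 periods with confidence intervals.
forecast, conf_int = model.predict(n_periods=6, return_conf_int=True)
print(model.summary())
print(forecast)
```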
Key Responsibilities:
● Design, develop, and maintain scalable web applications using .NET Core, .NET
Framework, C#, and related technologies.
● Participate in all phases of the SDLC, including requirements gathering, architecture
design, coding, testing, deployment, and support.
● Build and integrate RESTful APIs, and work with SQL Server, Entity Framework, and
modern front-end technologies such as Angular, React, and JavaScript.
● Conduct thorough code reviews, write unit tests, and ensure adherence to coding
standards and best practices.
● Lead or support .NET Framework to .NET Core migration initiatives, ensuring
minimal disruption and optimal performance.
● Implement and manage CI/CD pipelines using tools like Azure DevOps, Jenkins, or
GitLab CI/CD.
● Containerize applications using Docker and deploy/manage them on orchestration
platforms like Kubernetes or GKE.
● Lead and execute database migration projects, particularly transitioning from SQL
Server to PostgreSQL.
● Manage and optimize Cloud SQL for PostgreSQL, including configuration, tuning, and
ongoing maintenance.
● Leverage Google Cloud Platform (GCP) services such as GKE, Cloud SQL, Cloud
Run, and Dataflow to build and maintain cloud-native solutions.
● Handle schema conversion and data transformation tasks as part of migration and
modernization efforts.
Required Skills & Experience:
● 5+ years of hands-on experience with C#, .NET Core, and .NET Framework.
● Proven experience in application modernization and cloud-native development.
● Strong knowledge of containerization (Docker) and orchestration tools like
Kubernetes/GKE.
● Expertise in implementing and managing CI/CD pipelines.
● Solid understanding of relational databases and experience in SQL Server to
PostgreSQL migrations.
● Familiarity with cloud infrastructure, especially GCP services relevant to application
hosting and data processing.
● Excellent problem-solving and communication skills.
- A minimum of 4-10 years of experience in data integration/orchestration services, service architecture, and providing data-driven solutions for client requirements
- Experience with the Microsoft Azure cloud, Snowflake SQL, and database query/performance tuning
- Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus
- Strong data warehousing concepts and experience with ETL tools such as Talend Cloud Data Integration are a must
- Exposure to the financial domain is considered a plus
- Cloud managed services such as source control (GitHub) and MS Azure/DevOps are considered a plus
- Prior experience with State Street and Charles River Development ( CRD) considered a plus.
- Experience in tools such as Visio, PowerPoint, Excel.
- Exposure to Third party data providers such as Bloomberg, Reuters, MSCI and other Rating agencies is a plus.
- Strong SQL knowledge and debugging skills are a must.
We are seeking a passionate and experienced Java Full Stack Trainer to deliver high-quality training in front-end and back-end technologies. The ideal candidate will have hands-on experience in full stack development and a flair for teaching aspiring developers and students.
Job description
🔧 Key Responsibilities:
- Design and implement robust backend services using Node.js.
- Develop and maintain RESTful APIs to support front-end applications and third-party integrations
- Manage and optimize SQL/NoSQL databases (e.g., PostgreSQL, MongoDB, Snowflake)
- Collaborate with front-end developers to ensure seamless integration and data flow
- Implement caching, logging, and monitoring strategies for performance and reliability
- Ensure application security, scalability, and maintainability
- Participate in code reviews, architecture discussions, and agile ceremonies
✅ Required Skills:
- Proficiency in backend programming languages (Node.js, Java, .NET Core)
- Experience with API development and tools like Postman, Swagger
- Strong understanding of database design and query optimization
- Familiarity with microservices architecture and containerization (Docker, Kubernetes)
- Knowledge of cloud platforms (Azure, AWS) and CI/CD pipelines.
Profile - SQL Database
Experience - 5 Years
Location - Bangalore (5 days working)
Mandatory skills - SQL, Stored Procedures, MySQL
Notice Period - Immediate Joiners
Job Description -
- Strong experience in SQL
- Experience with databases - MySQL & PostgreSQL
- Experience in writing & adjusting stored procedures (T-SQL)
- Experience in query optimization, index creation, SQL joins, and sub-queries (see the sketch below)
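As referenced above, a tiny self-contained illustration of joins, sub-queries, and index creation; SQLite is used here only so the snippet runs without a server, while the role's actual stack is MySQL/PostgreSQL with stored procedures, and the tables and data are invented.

```python
# Self-contained illustration of joins, sub-queries, and index creation.
# SQLite is used only so this runs without a server; the posting's stack is MySQL/PostgreSQL.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders    (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'Asha'), (2, 'Ravi');
    INSERT INTO orders VALUES (1, 1, 500), (2, 1, 250), (3, 2, 90);

    -- Index the join/filter column so lookups avoid a full table scan.
    CREATE INDEX idx_orders_customer ON orders (customer_id);
""")

# Join + sub-query: customers whose total spend exceeds the overall average order amount.
rows = conn.execute("""
    SELECT c.name, SUM(o.amount) AS total_spend
    FROM customers AS c
    JOIN orders AS o ON o.customer_id = c.id
    GROUP BY c.name
    HAVING SUM(o.amount) > (SELECT AVG(amount) FROM orders)
""").fetchall()
print(rows)
```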
Role Overview
We are seeking an enthusiastic Software Engineer (Full Stack) with 3- 5 years of experience to join our growing team in Noida. In this role, you will play a pivotal part in designing and developing innovative SaaS products, contributing across both backend and frontend stacks, and helping drive digital transformation in a traditionally paper-driven domain.
This is a high-impact role where your work will directly influence the future of rare disease diagnostics in India and globally.
Responsibilities
- Contribute to all facets of software development and design
- Break down complex problems into modular, scalable solutions
- Write clean, efficient, well-commented code (Java, JavaScript, HTML, CSS)
- Develop robust and scalable web applications using Java Spring Boot and Hibernate
- Build and maintain RESTful APIs and integrate third-party services
- Work with MySQL databases and write optimized SQL queries
- Collaborate in Agile teams—participate in sprint planning, stand-ups, retrospectives
- Perform bug fixes, troubleshoot technical issues, and ensure system stability
- Design technical documentation and automated test suites
- Stay updated with emerging technologies and tools relevant to the tech stack
- Maintain a productive in-office work environment with essential infrastructure
Requirements
Experience
- 3-5 years of hands-on software development
- Experience working in Agile teams using tools like Azure DevOps or JIRA
Technical Skills
- Proficient in: Java, Spring Boot, Hibernate, MySQL, RESTful APIs
- Strong front-end development using JavaScript, HTML, CSS
- Solid understanding of Object-Oriented Design and common design patterns
- Skilled in SQL and database interactions
- Experience with token-based authentication and test-driven development
- Familiarity with Git for version control
Soft Skills
- Strong analytical and troubleshooting abilities
- Excellent communication and team collaboration skills
About Eazeebox
Eazeebox is India’s first specialized B2B platform for home electrical goods. We simplify supply chain logistics and empower electrical retailers through our one-stop digital platform — offering access to 100+ brands across 15+ categories, no MOQs, flexible credit options, and 4-hour delivery. We’re on a mission to bring technological inclusion to India's massive electrical retail industry.
Role Overview
We’re looking for a hands-on Full Stack Engineer who can build scalable backend systems using Python and mobile applications using React Native. You’ll work directly with the founder and a lean engineering team to architect and deliver core modules across our Quick Commerce stack – including retailer apps, driver apps, order management systems, and more.
What You’ll Do
- Develop and maintain backend services using Python
- Build and ship high-performance React Native apps for Android and iOS
- Collaborate on API design, microservices, and systems integration
- Ensure performance, reliability, and scalability across the stack
- Contribute to decisions on re-engineering, tech stack, and infra setup
- Work closely with the founder and product team to own end-to-end delivery
- Participate in collaborative working sessions and pair programming when needed
What We’re Looking For
- Strong proficiency in Python for backend development
- Experience building mobile apps with React Native
- Solid understanding of microservices architecture, API layers, and shared data models
- Familiarity with AWS or equivalent cloud platforms
- Exposure to Docker, Kubernetes, and CI/CD pipelines
- Ability to thrive in a fast-paced, high-ownership environment
Good-to-Have (Bonus Points)
- Experience working in Quick Commerce, logistics, or consumer apps
- Knowledge of PIM (Product Information Management) systems
- Understanding of key commerce algorithms (search, ranking, filtering, order management)
- Ability to use AI-assisted coding tools to speed up development
Why Join Us
- Build from scratch, not maintain legacy
- Work directly with the founder and influence tech decisions
- Shape meaningful digital infrastructure for a $35B+ industry
Role : Java Developer (2-7 years)
Location : Bangalore
Key responsibilities
- Develop and maintain high-quality, efficient, and scalable backend applications.
- Participate in all phases of the software development lifecycle (SDLC)
- Write clean, well-documented, and testable code adhering to best practices.
- Collaborate with team members to ensure the successful delivery of projects.
- Debug and troubleshoot complex technical problems.
- Identify and implement performance optimizations.
- Participate in code reviews
- Hands-on experience with Spring boot, Java 8 and above.
- 2-7 years of experience developing Java applications.
- Knowledge about at least one messaging system like Kafka, RabbitMQ etc.
- React developer requirements, qualifications & skills:
- Proficiency in React.js and its core principles
- Strong JavaScript, HTML5, and CSS3 skills
- Experience with popular React.js workflows (such as Redux)
- Strong understanding of object-oriented programming (OOP) principles.
- Experience with design patterns and best practices for Java development.
- Proficient in unit testing frameworks (e.g., JUnit).
- Experience with build automation tools (e.g., Maven, Gradle).
- Experience with version control systems (e.g., Git).
- Experience with one of these databases – Postgres, MongoDb, Cassandra
- Knowledge on Retail or OMS is a plus.
- Experienced in containerized deployments using Docker, Kubernetes and DevOps mindset
- Ability to reverse-engineer existing/legacy systems and document findings on Confluence.
- Create automated tests for unit, integration, regression, performance, and functional testing, to meet established expectations and acceptance criteria.
About the Role
We are looking for a Python Developer with expertise in data synchronization (ETL & Reverse ETL), automation workflows, AI functionality, and connectivity to work directly with a customer in Peliqan. In this role, you will be responsible for building seamless integrations, enabling AI-driven functionality, and ensuring data flows smoothly across various systems.
Key Responsibilities
- Build and maintain data sync pipelines (ETL & Reverse ETL) to ensure seamless data transfer between platforms.
- Develop automation workflows to streamline processes and improve operational efficiency.
- Implement AI-driven functionality, including AI-powered analytics, automation, and decision-making capabilities.
- Build and enhance connectivity between different data sources, APIs, and enterprise applications.
- Work closely with the customer to understand their technical needs and design tailored solutions in Peliqan.
- Optimize performance of data integrations and troubleshoot issues as they arise.
- Ensure security and compliance in data handling and integrations.
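For illustration, a minimal reverse-ETL sketch in Python: reading rows from a warehouse table and pushing them in batches to a downstream API. The endpoint URL, auth token, and table/column names are hypothetical placeholders, and SQLite stands in for the real warehouse.

```python
# Minimal reverse-ETL sketch: pull rows from a warehouse table, push them to a SaaS API.
# The endpoint URL, auth token, and table/column names are hypothetical placeholders.
import sqlite3
import requests

API_URL = "https://api.example.com/v1/contacts/bulk"   # hypothetical destination
API_TOKEN = "replace-me"                                # hypothetical credential

def extract(conn: sqlite3.Connection) -> list[dict]:
    conn.row_factory = sqlite3.Row
    rows = conn.execute("SELECT email, plan, last_seen FROM customer_profiles").fetchall()
    return [dict(r) for r in rows]

def load(records: list[dict], batch_size: int = 100) -> None:
    headers = {"Authorization": f"Bearer {API_TOKEN}"}
    for i in range(0, len(records), batch_size):
        resp = requests.post(API_URL, json=records[i:i + batch_size],
                             headers=headers, timeout=30)
        resp.raise_for_status()   # fail loudly so the orchestrator can retry the batch

if __name__ == "__main__":
    with sqlite3.connect("warehouse.db") as conn:   # stand-in for the real warehouse
        load(extract(conn))
```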
Requirements
- Strong experience in Python and related libraries for data processing & automation.
- Expertise in ETL, Reverse ETL, and workflow automation tools.
- Experience working with APIs, data connectors, and integrations across various platforms.
- Familiarity with AI & machine learning concepts and their practical application in automation.
- Hands-on experience with Peliqan or similar integration/data automation platforms is a plus.
- Strong problem-solving skills and the ability to work directly with customers to define and implement solutions.
- Excellent communication and collaboration skills.
Preferred Qualifications
- Experience in SQL, NoSQL databases, and cloud platforms (AWS, GCP, Azure).
- Knowledge of data governance, security best practices, and performance optimization.
- Prior experience in customer-facing engineering roles.
If you’re a Python & Integration Engineer who loves working on cutting-edge AI, automation, and data connectivity projects, we’d love to hear from you.