50+ Startup Jobs in Pune | Startup Job openings in Pune
Apply to 50+ Startup Jobs in Pune on CutShort.io. Explore the latest Startup Job opportunities across top companies like Google, Amazon & Adobe.
Salesforce Tech Lead
6+ yrs
Any 360 location - Hybrid
Strong in LWC
Must have experience in Aura
Strong in Integration
Strong in Deployment
Must be able to handle complex data
Should be able to manage a team
Should understand Integration
Must be able to coordinate with multiple stakeholders
Strong Data engineer profile
Mandatory (Experience 1): Must have 6 months+ of hands-on Data Engineering experience.
Mandatory (Experience 2): Must have end-to-end experience in building & maintaining ETL/ELT pipelines (not just BI/reporting).
Mandatory (Technical): Must have strong SQL capability
Preferred
Preferred (Experience): Worked on Call center data
Job Specific Criteria
CV Attachment is mandatory
Have you used Databricks or any notebook environment?
Have you worked on ETL/ELT workflow?
We follow an alternate-Saturday working schedule. Are you comfortable working from home on the 1st and 4th Saturdays?
Role: Azure Fabric Data Engineer
Experience: 5–10 Years
Location: Pune/Bangalore
Employment Type: Full-Time
About the Role
We are looking for an experienced Azure Data Engineer with strong expertise in Microsoft Fabric and Power BI to build scalable data pipelines, Lakehouse architectures, and enterprise analytics solutions on the Azure cloud.
Key Responsibilities
- Design & build data pipelines using Microsoft Fabric (Pipelines, Dataflows Gen2, Notebooks).
- Develop and optimize Lakehouse / Data Lake / Delta Lake architectures.
- Build ETL/ELT workflows using Fabric, Azure Data Factory, or Synapse.
- Create and optimize Power BI datasets, data models, and DAX calculations.
- Implement semantic models, incremental refresh, and Direct Lake/DirectQuery.
- Work with Azure services: ADLS Gen2, Azure SQL, Synapse, Event Hub, Functions, Databricks.
- Build dimensional models (Star/Snowflake) and support BI teams.
- Ensure data governance & security using Purview, RBAC, and AAD.
Required Skills
- Strong hands-on experience with Microsoft Fabric (Lakehouse, Pipelines, Dataflows, Notebooks).
- Expertise in Power BI (DAX, modeling, Dataflows, optimized datasets).
- Deep knowledge of Azure Data Engineering stack (ADF, ADLS, Synapse, SQL).
- Strong SQL, Python/PySpark skills.
- Experience in Delta Lake, Medallion architecture, and data quality frameworks.
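For context, the medallion architecture named above refines raw data in layers (Bronze for raw ingestion, Silver for cleansed data, Gold for business-ready aggregates). A minimal sketch of the idea, using plain Python lists in place of Delta tables (all records and rules here are illustrative, not part of the role):

```python
# Medallion sketch: Bronze -> Silver -> Gold, with Python lists
# standing in for Delta tables. Data and cleaning rules are toy examples.
bronze = [  # raw ingested rows: duplicates and bad records included
    {"id": 1, "amount": "100"},
    {"id": 1, "amount": "100"},   # duplicate
    {"id": 2, "amount": "bad"},   # malformed value
    {"id": 3, "amount": "250"},
]

# Silver: deduplicate and enforce types (a basic data-quality gate)
seen, silver = set(), []
for row in bronze:
    if row["id"] in seen:
        continue
    try:
        silver.append({"id": row["id"], "amount": int(row["amount"])})
        seen.add(row["id"])
    except ValueError:
        pass  # a real pipeline would quarantine malformed records

# Gold: aggregate for reporting
gold = {"total_amount": sum(r["amount"] for r in silver)}
```

In a Fabric or Databricks Lakehouse, each layer would be a Delta table and the cleaning step a Pipeline or notebook activity rather than a loop.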
Nice to Have
- Azure Certifications (DP-203, PL-300, Fabric Analytics Engineer).
- Experience with CI/CD (Azure DevOps/GitHub).
- Databricks experience (preferred).
Note: One technical round must be conducted face-to-face (F2F) at either the Pune or Bangalore office.
About the role:
We are looking for a Senior Site Reliability Engineer who understands the nuances of production systems. If you care about building and running reliable software systems in production, you'll like working at One2N.
You will primarily work with our startup and mid-size clients. We work on One-to-N kind of problems (hence the name One2N): the proof of concept is done, and the work revolves around scalability, maintainability, and reliability. In this role, you will be responsible for architecting and optimizing our observability and infrastructure to provide actionable insights into performance and reliability.
Responsibilities:
- Conceptualise and build platform engineering solutions with a self-serve model to enable product engineering teams.
- Provide technical guidance and mentorship to junior engineers.
- Participate in code reviews and contribute to best practices for development and operations.
- Design and implement comprehensive monitoring, logging, and alerting solutions to collect, analyze, and visualize data (metrics, logs, traces) from diverse sources.
- Develop custom monitoring metrics, dashboards, and reports to track key performance indicators (KPIs), detect anomalies, and troubleshoot issues proactively.
- Improve Developer Experience (DX) to help engineers improve their productivity.
- Design and implement CI/CD solutions to optimize velocity and shorten the delivery time.
- Help SRE teams set up on-call rosters and coach them for effective on-call management.
- Automate repetitive manual tasks across CI/CD pipelines and operations, applying infrastructure-as-code (IaC) practices.
- Stay up-to-date with emerging technologies and industry trends in cloud-native, observability, and platform engineering space.
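As a flavour of the anomaly-detection work described above, a minimal z-score check on a latency metric is sketched below (the samples, threshold, and metric are hypothetical; in practice this would be an alert rule in Prometheus, Datadog, or similar):

```python
import statistics

def is_anomaly(samples, new_value, threshold=3.0):
    """Flag a sample more than `threshold` standard deviations
    from the baseline mean (classic z-score rule)."""
    mean = statistics.mean(samples)
    stdev = statistics.stdev(samples) or 1.0  # guard against zero spread
    return abs(new_value - mean) / stdev > threshold

baseline = [100, 102, 98, 101, 99, 100, 103, 97]  # latency samples, ms
is_anomaly(baseline, 150)  # a 150 ms spike trips the alert
is_anomaly(baseline, 101)  # a normal reading does not
```

Real observability stacks apply the same idea over rolling windows and per-series baselines rather than a fixed list.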
Requirements:
- 6-9 years of professional experience in DevOps practices or software engineering roles, with a focus on Kubernetes on an AWS platform.
- Expertise in observability and telemetry tools and practices, including hands-on experience with tools such as Datadog, Honeycomb, ELK, Grafana, and Prometheus.
- Working knowledge of programming using Golang, Python, Java, or equivalent.
- Skilled in diagnosing and resolving Linux operating system issues.
- Strong proficiency in scripting and automation to build monitoring and analytics solutions.
- Solid understanding of microservices architecture, containerization (Docker, Kubernetes), and cloud-native technologies.
- Experience with infrastructure as code (IaC) tools such as Terraform, Pulumi.
- Excellent analytical and problem-solving skills, keen attention to detail, and a passion for continuous improvement.
- Strong written, communication, and collaboration skills, with the ability to work effectively in a fast-paced, agile environment.
Review Criteria
- Strong Senior Data Engineer profile
- 4+ years of hands-on Data Engineering experience
- Must have experience owning end-to-end data architecture and complex pipelines
- Must have advanced SQL capability (complex queries, large datasets, optimization)
- Must have strong Databricks hands-on experience
- Must be able to architect solutions, troubleshoot complex data issues, and work independently
- Must have Power BI integration experience
- The CTC structure is 80% fixed and 20% variable.
Preferred
- Worked on Call center data, understand nuances of data generated in call centers
- Experience implementing data governance, quality checks, or lineage frameworks
- Experience with orchestration tools (Airflow, ADF, Glue Workflows), Python, Delta Lake, Lakehouse architecture
Job Specific Criteria
- CV Attachment is mandatory
- Are you comfortable integrating with Power BI datasets?
- We follow an alternate-Saturday working schedule. Are you comfortable working from home on the 1st and 4th Saturdays?
Role & Responsibilities
We are seeking a highly experienced Senior Data Engineer with strong architectural capability, excellent optimisation skills, and deep hands-on experience in modern data platforms. The ideal candidate will have advanced SQL skills, strong expertise in Databricks, and practical experience working across cloud environments such as AWS and Azure. This role requires end-to-end ownership of complex data engineering initiatives, including architecture design, data governance implementation, and performance optimisation. You will collaborate with cross-functional teams to build scalable, secure, and high-quality data solutions.
Key Responsibilities-
- Lead the design and implementation of scalable data architectures, pipelines, and integration frameworks.
- Develop, optimise, and maintain complex SQL queries, transformations, and Databricks-based data workflows.
- Architect and deliver high-performance ETL/ELT processes across cloud platforms.
- Implement and enforce data governance standards, including data quality, lineage, and access control.
- Partner with analytics, BI (Power BI), and business teams to enable reliable, governed, and high-value data delivery.
- Optimise large-scale data processing, ensuring efficiency, reliability, and cost-effectiveness.
- Monitor, troubleshoot, and continuously improve data pipelines and platform performance.
- Mentor junior engineers and contribute to engineering best practices, standards, and documentation.
Ideal Candidate
- Proven industry experience as a Senior Data Engineer, with ownership of high-complexity projects.
- Advanced SQL skills with experience handling large, complex datasets.
- Strong expertise with Databricks for data engineering workloads.
- Hands-on experience with major cloud platforms — AWS and Azure.
- Deep understanding of data architecture, data modelling, and optimisation techniques.
- Familiarity with BI and reporting environments such as Power BI.
- Strong analytical and problem-solving abilities with a focus on data quality and governance.
- Proficiency in Python or another programming language is a plus.
PERKS, BENEFITS AND WORK CULTURE:
Our people define our passion and our audacious, incredibly rewarding achievements. The company is one of India’s most diversified Non-banking financial companies, and among Asia’s top 10 Large workplaces. If you have the drive to get ahead, we can help find you an opportunity at any of the 500+ locations we’re present in India.
ROLES AND RESPONSIBILITIES:
We are looking for a Junior Data Engineer who will work under guidance to support data engineering tasks, perform basic coding, and actively learn modern data platforms and tools. The ideal candidate should have foundational SQL knowledge, basic exposure to Databricks. This role is designed for early-career professionals who are eager to grow into full data engineering responsibilities while contributing to data pipeline operations and analytical support.
Key Responsibilities-
- Support the development and maintenance of data pipelines and ETL/ELT workflows under mentorship.
- Write basic SQL queries, transformations, and assist with Databricks notebook tasks.
- Help troubleshoot data issues and contribute to ensuring pipeline reliability.
- Work with senior engineers and analysts to understand data requirements and deliver small tasks.
- Assist in maintaining documentation, data dictionaries, and process notes.
- Learn and apply data engineering best practices, coding standards, and cloud fundamentals.
- Support basic tasks related to Power BI data preparation or integrations as needed.
IDEAL CANDIDATE:
- Foundational SQL skills with the ability to write and understand basic queries.
- Basic exposure to Databricks, data transformation concepts, or similar data tools.
- Understanding of ETL/ELT concepts, data structures, and analytical workflows.
- Eagerness to learn modern data engineering tools, technologies, and best practices.
- Strong problem-solving attitude and willingness to work under guidance.
- Good communication and collaboration skills to work with senior engineers and analysts.
PERKS, BENEFITS AND WORK CULTURE:
Our people define our passion and our audacious, incredibly rewarding achievements. Bajaj Finance Limited is one of India’s most diversified Non-banking financial companies, and among Asia’s top 10 Large workplaces. If you have the drive to get ahead, we can help find you an opportunity at any of the 500+ locations we’re present in India.
- Strong Senior Data Engineer profile
- Mandatory (Experience 1): Must have 4+ years of hands-on Data Engineering experience
- Mandatory (Experience 2): Must have experience owning end-to-end data architecture and complex pipelines
- Mandatory (Technical 1): Must have advanced SQL capability (complex queries, large datasets, optimization)
- Mandatory (Technical 2): Must have strong Databricks hands-on experience
- Mandatory (Role Requirement): Must be able to architect solutions, troubleshoot complex data issues, and work independently
- Mandatory (BI Requirement): Must have Power BI integration experience
- Mandatory (Note): Bajaj's CTC structure is 80% fixed and 20% variable
ROLES AND RESPONSIBILITIES:
Video-led Content Strategy:
- Implement the video-led content strategy to meet business objectives such as increases in views, leads, product awareness, and CTR
- Execute the video content framework that talks to a varied TG, mixes formats and languages (English and vernacular)
Production:
- Ability to write/edit clear and concise copy and briefs unique to the platform, and influence/direct designers and agencies for creative output
- Create the monthly production pipeline, and review and edit every piece of video content that is produced
- Explore AI-based and production automation tools to help create communication stimuli at scale
- Manage our agency and partner ecosystem to support high-scale video production
- Manage the monthly production flow and maintain data sheets to enable ease of tracking
- Increase CTR, Views and other critical metrics for all video production
Project Management:
- Oversee the creation of various video formats, including explainer videos, product demos, customer testimonials, etc.
- Plan and manage video production schedules, ensuring timely delivery of projects within budget
- Upload and manage video content across multiple digital platforms and other relevant channels
- Ensure all video content is optimised for each platform, following best practices for SEO and audience engagement
- Coordinate with the content team to integrate video content on the platforms
- Maintain an archive of video assets and ensure proper documentation and tagging
Capabilities:
- Drive the development of capabilities around production, automation, and upload, thereby reducing TAT and effort
- Work with technology teams to explore Gen AI tools to deliver output at scale and speed
- Identify opportunities for new formats and keep up with trends in the video content space
Customer obsession and governance:
- Relentless focus on making customer interactions non-intrusive; using video content to create a frictionless experience
- Zero tolerance for content and communication errors
- Develop a comprehensive video guidelines framework that is easy to use by businesses yet creates a distinct identity for the brand
- Have a strong eye for grammar and ensure that every content unit adheres to the brand tone of voice
- Create checks and balances in the system so that all customer-facing content is first time right, every time
Performance tracking:
- Track and analyse production, go-live status, and engagement metrics using tools like Google Analytics
- Gauge efficacy of video content produced, and drive changes wherever needed
- Provide regular reports on video performance, identifying trends, insights, and areas for improvement
IDEAL CANDIDATE:
Qualifications:
- Bachelor's degree in Communications, Digital Marketing, Advertising, or a related field
- Proven experience as a creative/content writer or in a similar role, preferably with exposure to AI-driven content creation.
Work Experience:
- 3-5 years of relevant experience in content marketing/advertising; experience in digital marketing with a focus on video content is an advantage
Skills :
- Excellent command over the English language
- Hands-on experience of copywriting, editing, and creating communication
- Ability to handle complex briefs and ideate out of the box
- Creative thinking and problem-solving skills, with a passion for storytelling and visual communication
- Deep customer focus by understanding customer behaviour and analysing data & real-world experiences
- Detail orientation and very structured thinking; considers the customer's entire journey and experience
- Strong communication and collaboration skills to effectively work with diverse teams
- Passion for emerging technologies and the ability to adapt to a fast-paced and evolving environment
- Excellent project management skills, with the ability to manage multiple projects simultaneously and meet tight deadlines
- Proficiency in AI tools and video editing software (e.g., Adobe Premiere Pro, Final Cut Pro) and familiarity with graphic design software (e.g., Adobe After Effects, Photoshop)
PERKS, BENEFITS AND WORK CULTURE:
Our people define our passion and our audacious, incredibly rewarding achievements. Bajaj Finance Limited is one of India’s most diversified Non-banking financial companies, and among Asia’s top 10 Large workplaces. If you have the drive to get ahead, we can help find you an opportunity at any of the 500+ locations we’re present in India.
ROLES AND RESPONSIBILITIES:
We are seeking a skilled Data Engineer who can work independently on data pipeline development, troubleshooting, and optimisation tasks. The ideal candidate will have strong SQL skills, hands-on experience with Databricks, and familiarity with cloud platforms such as AWS and Azure. You will be responsible for building and maintaining reliable data workflows, supporting analytical teams, and ensuring high-quality, secure, and accessible data across the organisation.
KEY RESPONSIBILITIES:
- Design, develop, and maintain scalable data pipelines and ETL/ELT workflows.
- Build, optimise, and troubleshoot SQL queries, transformations, and Databricks data processes.
- Work with large datasets to deliver efficient, reliable, and high-performing data solutions.
- Collaborate closely with analysts, data scientists, and business teams to support data requirements.
- Ensure data quality, availability, and security across systems and workflows.
- Monitor pipeline performance, diagnose issues, and implement improvements.
- Contribute to documentation, standards, and best practices for data engineering processes.
IDEAL CANDIDATE:
- Proven experience as a Data Engineer or in a similar data-focused role (3+ years).
- Strong SQL skills with experience writing and optimising complex queries.
- Hands-on experience with Databricks for data engineering tasks.
- Experience with cloud platforms such as AWS and Azure.
- Understanding of ETL/ELT concepts, data modelling, and pipeline orchestration.
- Familiarity with Power BI and data integration with BI tools.
- Strong analytical and troubleshooting skills, with the ability to work independently.
- Experience working end-to-end on data engineering workflows and solutions.
PERKS, BENEFITS AND WORK CULTURE:
Our people define our passion and our audacious, incredibly rewarding achievements. The company is one of India’s most diversified Non-banking financial companies, and among Asia’s top 10 Large workplaces. If you have the drive to get ahead, we can help find you an opportunity at any of the 500+ locations we’re present in India.
About the Role:
We're looking for a Staff Software Engineer who will work across teams to design and implement robust solutions, mentor other engineers, and drive technical excellence. If you care about building and running reliable software systems in production, you'll like working at One2N.
You'll primarily work with our enterprise customers on One-to-N kind problems, where you:
- Design, build and scale reliable software systems that handle real-world production demands
- Solve technical challenges around performance bottlenecks and system scaling
- Build and scale platforms for high throughput and low latency
Key Responsibilities:
- Lead end-to-end architecture and design for large-scale systems and technical initiatives
- Create and maintain technical documentation through Architecture Design Records (ADR) and Request For Comments (RFC)
- Participate in architecture reviews and help teams make better technical decisions
- Drive improvements in system reliability, performance, and scalability
- Tackle complex technical challenges, especially around bottlenecks and scaling
- Mentor engineers through 1:1s, code reviews, and technical guidance
- Design high-throughput, low-latency systems that can scale
- Contribute to shared libraries, internal SDKs, and developer tooling to improve engineering efficiency
About You:
- 9+ years of professional programming experience with JVM languages (Java/Scala/Kotlin)
- Lead by example with high-quality, maintainable, and testable code and good architecture decisions
- Strong experience with:
- REST API design and implementation
- Spring Boot framework
- Database schema modeling
- Test-Driven Development (TDD)
- Domain Driven Design
- Experience in building microservices architectures
- Strong testing skills with Unit and Integration testing. Good to have experience with Contract-driven tests.
- Ability to build reusable libraries/SDKs
- Brainstorm with product and business teams to prioritize the backlog
- Experience with Kafka for event-driven architectures to build auto-scaling data processing pipelines
- Exposure to AWS architecture, particularly with EKS and S3
- Understanding architecture implications for cost, security, and performance
Leadership & Communication:
- Track record of mentoring and coaching engineers
- Strong written and verbal communication skills
- Ability to influence without authority
Strong Data engineer profile
Mandatory (Experience 1): Must have 2+ years of hands-on Data Engineering experience.
Mandatory (Experience 2): Must have end-to-end experience in building & maintaining ETL/ELT pipelines (not just BI/reporting).
Mandatory (Technical 1): Must have strong SQL capability (complex queries + optimization).
Mandatory (Technical 2): Must have hands-on Databricks experience.
Mandatory (Role Requirement): Must be able to work independently, troubleshoot data issues, and manage large datasets.
As a Lead Artificial Intelligence Engineer at Leapfrog Technology, you will be at the forefront of shaping the future of data-driven solutions. You'll lead a talented team, drive the development of innovative AI projects, and work collaboratively across functions to turn complex business challenges into actionable insights.
Key Responsibilities:
● Leadership Excellence: Lead and inspire a team of AI Engineers and Data Scientists, fostering a culture of innovation, collaboration, and continuous growth.
● End-to-End Ownership: Take full ownership of the AI project lifecycle, from ideation and design to development, deployment, and maintenance.
● Technological Innovation: Explore and assess emerging technologies to enhance the performance, maintainability, and reliability of AI systems.
● Engineering Best Practices: Apply robust software engineering practices to AI, including CI/CD pipelines, automation, and quality assurance.
● Architectural Leadership: Collaborate with technology experts to make informed architectural decisions, and ensure thorough technical documentation.
● Risk Mitigation: Proactively identify and address project risks, conduct root cause analysis, and implement preventive measures.
● Cross-functional Collaboration: Engage closely with cross-functional teams, including business stakeholders, product managers, software engineers, and data engineers, to deliver impactful data-driven solutions.
● Continuous Learning: Stay at the cutting edge of data science, ML, and AI developments, and leverage emerging technologies to solve complex business problems.
● Mentorship and Growth: Coach and motivate team members, identify training needs, and foster their professional development.
● Organizational Excellence: Actively uphold and promote the company's culture, processes, and standards to ensure consistent excellence in our work.
Job requirements
Education and Experience:
- A degree (Masters preferred) in Computer Science, Engineering, Artificial Intelligence, Data Science, Applied Mathematics, or related fields.
- 6+ years of hands-on experience in AI/ML or Data Science, preferably in real industry settings, with a track record of building data products that have positively impacted customer satisfaction and revenue growth.
Technical Skills:
- Proficiency in a wide range of Machine Learning techniques and algorithms, with the ability to apply advanced analytics methods, including Bayesian statistics, clustering, text analysis, time series analysis, and neural networks on large-scale datasets.
- Expertise in at least one specialized area of application, such as Computer Vision or Natural Language Processing (NLP). (NLP Expertise preferred)
- Strong programming skills in Python, including expertise in the data ecosystem (NumPy, SciPy, Pandas, etc.), or equivalent skills in languages like R, Java, Scala, or Julia, with a focus on producing production-quality code.
- Hands-on experience with popular ML frameworks like Scikit-Learn, PyTorch, or TensorFlow.
- Expertise with Generative AI and Large Language Models (LLMs) along with their implementation in real-life applications.
- Experience building end-to-end ML systems (MLOps).
- Experience in deploying code in web frameworks such as Flask, FastAPI or Django.
- Experience working in a cloud environment like AWS, Azure, or GCP for ML work.
- Good grasp of SQL/NoSQL databases and scripting skills, particularly within analytics platforms and data warehouses.
- Good grasp of software engineering concepts (SDLC, Version Control, CI/CD, Containerization, Scalability and so on), programming concepts, and tools/platforms like Git and Docker.
- Bonus: Experience with Big Data technologies such as Apache Spark, Kafka, Kinesis, and cloud-based ML platforms like AWS SageMaker or GCP ML Engine.
- Bonus: Experience with data visualization and dashboard tools like Tableau or Power BI.
Soft Skills:
- Highly motivated, self-driven, entrepreneurial mindset, and capable of solving complex analytical problems under high-pressure situations.
- Ability to work with cross-functional and cross-regional teams.
- Ability to lead a team of Data/AI professionals and work with senior management, technological experts, and the product team.
- Excellent written and verbal communication skills, comfortable with client communication.
- Good leadership skills - ability to motivate and mentor team members, ability to plan and make sound decisions, ability to negotiate tactfully with the client and team.
- Results-oriented, customer-focused with a passion for resolving tough technical and operational challenges.
- Possess excellent analytical and problem-solving abilities.
- Good documentation skills.
- Experienced with Agile methodologies like Scrum/Kanban.
🔹 About the Role:
Title: Paid Marketing Executive — Mid-level (2+ years experience)
We’re looking for a data-driven Paid Marketing Executive to plan, execute, and optimize performance marketing campaigns across Google Ads, Meta Ads, and LinkedIn Ads.
The ideal candidate will have hands-on experience in lead generation, campaign analytics, and ROI optimization, with the ability to turn insights into impactful growth strategies.
🎯 Key Responsibilities:
- Plan, launch, and optimize campaigns across Google, Meta, and LinkedIn Ads.
- Manage end-to-end campaign execution — targeting, ad setup, A/B testing, landing page optimization, and reporting.
- Drive lead generation and user acquisition while maintaining low CPA and high ROAS.
- Run remarketing, retargeting, and lookalike campaigns to improve conversion rates.
- Implement A/B testing for creatives, copies, and funnels to enhance performance.
- Utilize AI tools and automation to streamline campaign management and reporting.
- Monitor and analyze KPIs (CTR, CPC, CPA, ROAS, Conversion Rates) for ongoing performance improvement.
- Prepare and present reports using Google Analytics, GTM, Meta Ads Manager, and LinkedIn Campaign Manager.
🧩 Requirements:
- Minimum 2 years of experience in Performance Marketing / Digital Marketing (preferably B2B or Finance sector).
- Hands-on experience with Google Ads, Meta Ads, and LinkedIn Ads.
- Strong analytical mindset with experience in funnel optimization and A/B testing.
- Proficiency in analytics tools – Google Analytics, GTM, and Ads Managers.
- Ability to manage budgets effectively and deliver measurable ROI.
- Excellent communication and collaboration skills.
- Bachelor’s degree in Marketing, Business, or a related field.
- Certifications (Google Ads / Facebook Blueprint) are a plus.
📈 If you’re passionate about paid campaigns, analytics, and driving digital growth — this role is for you!
Join our dynamic marketing team and make an impact with data-backed decisions.
About Vijay Sales
Vijay Sales is one of India’s leading electronics retail brands with 160+ stores nationwide and a fast-growing digital presence. We are on a mission to build the most advanced data-driven retail intelligence ecosystem—using AI, predictive analytics, LLMs, and real-time automation to transform customer experience, supply chain, and omnichannel operations.
Role Overview
We are looking for a highly capable AI Engineer who is passionate about building production-grade AI systems, designing scalable ML architecture, and working with cutting-edge AI/ML tools. This role involves hands-on work with Databricks, SQL, PySpark, modern LLM/GenAI frameworks, and full lifecycle ML system design.
Key Responsibilities
Machine Learning & AI Development
- Build, train, and optimize ML models for forecasting, recommendation, personalization, churn prediction, inventory optimization, anomaly detection, and pricing intelligence.
- Develop GenAI solutions using modern LLM frameworks (e.g., LangChain, LlamaIndex, HuggingFace Transformers).
- Explore and implement RAG (Retrieval Augmented Generation) pipelines for product search, customer assistance, and support automation.
- Fine-tune LLMs on company-specific product and sales datasets (using QLoRA, PEFT, and Transformers).
- Develop scalable feature engineering pipelines leveraging Delta Lake and Databricks Feature Store.
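To make the RAG responsibility above concrete: the retrieval step embeds documents and the query, then selects the closest document as context for the LLM. A toy sketch with hand-made vectors (a real system would use an embedding model, e.g. from HuggingFace, and a vector database; all names and numbers here are illustrative):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Toy "embeddings" for two product-support documents
docs = {
    "return policy": [0.9, 0.1, 0.0],
    "tv installation": [0.1, 0.8, 0.3],
}
query_vec = [0.85, 0.15, 0.05]  # embedding of a hypothetical customer question

# Retrieval: pick the document nearest to the query
best = max(docs, key=lambda d: cosine(docs[d], query_vec))
# `best` would then be injected as context into the LLM prompt
```

Production pipelines add chunking, top-k retrieval, and re-ranking on top of this core idea.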
Databricks / Data Engineering
- Build end-to-end ML workflows on Databricks using PySpark, MLflow, Unity Catalog, Delta Live Tables.
- Optimize Databricks clusters for cost, speed, and stability.
- Maintain reusable notebooks and parameterized pipelines for model ingestion, validation, and deployment.
- Use MLflow for tracking experiments, model registry, and lifecycle management.
Data Handling & SQL
- Write advanced SQL for multi-source data exploration, aggregation, and anomaly detection.
- Work on large, complex datasets from ERP, POS, CRM, Website, and Supply Chain systems.
- Automate ingestion of streaming and batch data into Databricks pipelines.
Deployment & MLOps
- Deploy ML models using REST APIs, Databricks Model Serving, Docker, or cloud-native endpoints.
- Build CI/CD pipelines for ML using GitHub Actions, Azure DevOps, or Databricks Workflows.
- Implement model monitoring for drift, accuracy decay, and real-time alerts.
- Maintain GPU/CPU environments for training workflows.
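Drift monitoring, as listed above, boils down to comparing live data against a training-time baseline and alerting on significant movement. A deliberately simple sketch (a mean-shift check; production setups typically use PSI or KS tests, and all numbers below are invented):

```python
import statistics

def mean_shift_alert(baseline, live, z_thresh=3.0):
    """Flag drift when the live mean moves more than z_thresh baseline
    standard errors away from the baseline mean."""
    mu = statistics.mean(baseline)
    se = statistics.stdev(baseline) / (len(live) ** 0.5)
    z = abs(statistics.mean(live) - mu) / se
    return z > z_thresh

baseline = [10.0, 11.0, 9.5, 10.5, 10.0, 9.8, 10.2, 10.4]
stable   = [10.1, 9.9, 10.3, 10.0]   # similar distribution: no alert
shifted  = [14.0, 15.2, 14.8, 15.5]  # clear upward shift: alert

print(mean_shift_alert(baseline, stable))   # False
print(mean_shift_alert(baseline, shifted))  # True
```

In a real deployment this check would run on a schedule against feature or prediction distributions and feed the alerting system.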
Must-Have Technical Skills
Core AI/ML
- Strong fundamentals in machine learning: regression, classification, time-series forecasting, clustering.
- Experience in deep learning using PyTorch or TensorFlow/Keras.
- Expertise in LLMs, embeddings, vector databases, and GenAI architecture.
- Hands-on experience with HuggingFace, embedding models, and RAG.
Databricks & Big Data
- Hands-on experience with Databricks (PySpark, SQL, Delta Lake, MLflow, Feature Store).
- Strong understanding of Spark execution, partitioning, and optimization.
Programming
- Strong proficiency in Python.
- Experience writing high-performance SQL with window functions, CTEs, and analytical queries.
- Knowledge of Git, CI/CD, REST APIs, and Docker.
MLOps & Production Engineering
- Experience deploying models to production and monitoring them.
- Familiarity with tools like MLflow, Weights & Biases, or SageMaker equivalents.
- Experience in building automated training pipelines and handling model drift/feedback loops.
Preferred Domain Experience
- Retail/e-commerce analytics
- Demand forecasting
- Inventory optimization
- Customer segmentation & personalization
- Price elasticity and competitive pricing
Job Title: Inside Sales Specialist
Location: Baner, Pune
Experience: 1–3 Years
Employment Type: Full-Time | WFO
About the Role:
We are looking for an Inside Sales Specialist with proven experience in identifying, nurturing, and converting qualified leads through outbound channels. The ideal candidate will have hands-on experience in email campaigns, LinkedIn outreach, and cold calling targeting international markets such as DACH, BENELUX, and the UK.
Key Responsibilities:
- Generate qualified B2B leads through cold calls, emails, and LinkedIn campaigns.
- Execute targeted email and LinkedIn outreach campaigns to build a strong sales pipeline.
- Manage follow-ups, schedule meetings, and support client acquisition activities.
- Maintain CRM data and provide weekly reports on lead quality and conversions.
- Collaborate with the sales and marketing team to improve campaign strategies and conversion rates.
- Ensure a minimum of 40 qualified leads per month are generated through outbound efforts.
- Identify new markets, decision-makers, and business opportunities across international regions.
Required Skills & Experience:
- 1–3 years of proven experience in B2B Lead Generation / Inside Sales.
- Strong communication and persuasion skills in English.
- Experience with international lead generation (especially DACH, BENELUX & UK markets).
- Proficiency in using LinkedIn Sales Navigator, CRM tools (HubSpot, Zoho, etc.), and email campaign tools.
- Ability to research, segment, and target the right prospects.
- Self-motivated, goal-oriented, and comfortable working with monthly lead targets.
Company Overview
McKinley Rice is not just a company; it's a dynamic community, the next evolutionary step in professional development. Spiritually, we're a hub where individuals and companies converge to unleash their full potential. Organizationally, we are a conglomerate composed of various entities, each contributing to the larger narrative of global excellence.
Redrob by McKinley Rice: Redefining Prospecting in the Modern Sales Era
Backed by a $40 million Series A funding from leading Korean & US VCs, Redrob is building the next frontier in global outbound sales. We’re not just another database—we’re a platform designed to eliminate the chaos of traditional prospecting. In a world where sales leaders chase meetings and deals through outdated CRMs, fragmented tools, and costly lead-gen platforms, Redrob provides a unified solution that brings everything under one roof.
Inspired by the breakthroughs of Salesforce, LinkedIn, and HubSpot, we’re creating a future where anyone, not just enterprise giants, can access real-time, high-quality data on 700M+ decision-makers, all in just a few clicks.
At Redrob, we believe the way businesses find and engage prospects is broken. Sales teams deserve better than recycled data, clunky workflows, and opaque credit-based systems. That’s why we’ve built a seamless engine for:
- Precision prospecting
- Intent-based targeting
- Data enrichment from 16+ premium sources
- AI-driven workflows to book more meetings, faster
We’re not just streamlining outbound—we’re making it smarter, scalable, and accessible. Whether you’re an ambitious startup or a scaled SaaS company, Redrob is your growth copilot for unlocking warm conversations with the right people, globally.
EXPERIENCE
Duties you'll be entrusted with:
- Develop and execute scalable APIs and applications using the Node.js or Nest.js framework
- Writing efficient, reusable, testable, and scalable code.
- Understanding, analyzing, and implementing business needs and feature modification requests, and converting them into software components
- Integrating user-oriented elements and data storage solutions into different applications
- Developing backend components, server-side logic, statistical learning models, and highly responsive web applications to enhance performance and responsiveness
- Designing and implementing high-availability, low-latency applications with data protection and security features
- Performance tuning and automation of applications and enhancing the functionalities of current software systems.
- Keeping abreast with the latest technology and trends.
Expectations from you:
Basic Requirements
- Minimum qualification: Bachelor’s degree or more in Computer Science, Software Engineering, Artificial Intelligence, or a related field.
- Experience with Cloud platforms (AWS, Azure, GCP).
- Strong understanding of monitoring, logging, and observability practices.
- Experience with event-driven architectures (e.g., Kafka, RabbitMQ).
- Expertise in designing, implementing, and optimizing Elasticsearch.
- Work with modern tools including Jira, Slack, GitHub, Google Docs, etc.
- Expertise in event-driven architecture.
- Experience in integrating Generative AI APIs.
- Working experience with high user concurrency.
- Experience with scaled databases handling millions of records (indexing, retrieval, etc.)
Technical Skills
- Demonstrable experience in web application development with expertise in Node.js or Nest.js.
- Knowledge of database technologies and agile development methodologies.
- Experience working with databases, such as MySQL or MongoDB.
- Familiarity with web development frameworks, such as Express.js.
- Understanding of microservices architecture and DevOps principles.
- Well-versed with AWS and serverless architecture.
Soft Skills
- A quick and critical thinker with the ability to generate a range of ideas about a topic and bring fresh, innovative ideas to the table to enhance the visual impact of our content.
- Potential to apply innovative and exciting ideas, concepts, and technologies.
- Stay up-to-date with the latest design trends, animation techniques, and software advancements.
- Multi-tasking and time-management skills, with the ability to prioritize tasks.
THRIVE
Some of the extensive benefits of being part of our team:
- We offer skill enhancement and educational reimbursement opportunities to help you further develop your expertise.
- The Member Reward Program provides an opportunity for you to earn up to INR 85,000 as an annual Performance Bonus.
- The McKinley Cares Program has a wide range of benefits:
- The wellness program covers sessions for mental wellness and fitness, and offers health insurance.
- In-house benefits have a referral bonus window and sponsored social functions.
- An Expanded Leave Basket, including paid Maternity and Paternity Leaves and Rejuvenation Leaves, apart from the regular 20 leaves per annum.
- Our Family Support benefits not only include maternity and paternity leaves but also extend to provide childcare benefits.
- In addition to the retention bonus, our McKinley Retention Benefits program also includes a Leave Travel Allowance program.
- We also offer an exclusive McKinley Loan Program designed to assist our employees during challenging times and alleviate financial burdens.
We are looking for a Senior Lead Designer at Hummingbird. In this role, you will guide and mentor a team of Graphic and UI/UX designers, ensuring the delivery of high-quality, impactful designs that reflect our brand and enhance user experience.
- Design Leadership: Lead, mentor, and guide the design team while maintaining a high standard of creative excellence.
- Graphic Design: Produce visually compelling graphics and illustrations aligned with brand guidelines across various platforms.
- UI/UX Design: Create intuitive, user-centric interfaces and seamless digital experiences.
- Tool Proficiency: Work with tools such as Figma and Adobe Illustrator to design wireframes, prototypes, and final visuals.
- Communication: Clearly present design concepts, decisions, and feedback to team members and stakeholders.
- Team Motivation: Inspire and encourage the design team to foster a collaborative and innovative environment.
- Project Management: Manage multiple design projects to ensure timely delivery and alignment with business goals.
- Design Best Practices: Advocate for and educate the team on modern design principles, usability, and industry trends.
Qualifications
- 6 to 12 years of experience in Graphic Design and UI/UX Design, preferably within the e-commerce domain.
- Proven experience leading and managing a design team.
- Strong portfolio showcasing successful graphic and UI/UX work.
- Excellent communication, leadership, and collaboration skills.
- Ability to adapt to evolving project requirements and thrive in a fast-paced environment.
Design, develop, implement, and unit test enterprise-grade applications using Angular 8+, C#, .NET Core, and ASP.NET.
Participate in design discussions, code reviews, and sprint planning as part of an agile team.
Business Development Executive (Floor and Wall tiles)
About The Company:
A leading MNC in tiles and bathware, with operations in over 40 countries and a group turnover of more than USD 1 billion. The company is strengthening its footprint in India by expanding its sales and business development team.
Job Title: Business Development Executive
Locations: Delhi, Lucknow, and Pune
Responsibilities:
- Achieve regional client activation and sales targets through proactive territory management.
- Secure product specifications, approvals, and conversions by providing strategic sales guidance to clients and stakeholders.
- Meet sales targets across all product lines by planning and executing targeted sales initiatives.
- Collaborate with the sales team to retain and grow the customer base, building strong relationships with key accounts and identifying new business opportunities.
- Develop and maintain relationships with architects, builders, interior designers, influencers, government authorities, and corporate customers to drive project specifications and orders.
- Leverage a strong stakeholder network to drive sustainable revenue growth across assigned markets.
Benefits and Perks: Rs. 8 – 10 LPA (including 10% variable pay).
Qualifications:
- Industry Experience: Minimum 2 years in sales and business development, specifically in tiles and bathware.
- Education: Bachelor's or Master's Degree
- Languages and Communication: Fluency in English and the relevant regional language is mandatory.
About the Company:
Verinite is a global technology consulting and services company laser-focused on the banking & financial services sector, especially in cards, payments, lending, trade, and treasury.
They partner with banks, fintechs, payment processors, and other financial institutions to modernize their systems, improve operational resilience, and accelerate digital transformation. Their services include consulting, digital strategy, data, application modernization, quality engineering (testing), cloud & infrastructure, and application maintenance.
Skill – Authorization, Clearing and Settlement
1. Individual should have worked on schemes (Visa, Amex, Discover, RuPay & Mastercard), on either the authorization or clearing side.
2. Should be able to read scheme specifications and create business requirements/mappings for authorization and clearing.
3. Should have hands-on experience in implementing scheme-related changes.
4. Should be able to validate and certify the change post-development, based on the mapping created.
5. Should be able to work with the Dev team, explaining and guiding them on a time-to-time basis.
6. Able to communicate with various teams & senior stakeholders.
7. Go-getter and a great googler.
8. Schemes – VISA/MC/AMEX/JCB/CUP/Mercury – Discover and Diners, CBUAE, Jaywan (local scheme from UAE).
9. Experience with the Issuing side is a plus (good to have).
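To illustrate point 2 above: creating business mappings from scheme specifications often amounts to translating raw data elements into an internal model. The field numbers below follow ISO 8583 conventions (field 2 is the PAN, field 4 the amount, field 49 the currency), but the element names and the mapping itself are purely hypothetical:

```python
# Hypothetical, simplified mapping of scheme spec data elements to internal
# field names; real Visa/Mastercard specs define hundreds of fields with
# strict formats and conditional presence rules.
SPEC_MAP = {
    "DE002": "pan",            # primary account number
    "DE004": "amount",         # transaction amount, minor units
    "DE049": "currency_code",  # ISO 4217 numeric code
}

def to_business_record(iso_msg: dict) -> dict:
    """Translate raw scheme data elements into the internal model used
    by authorization/clearing validation; unmapped elements are dropped."""
    return {SPEC_MAP[k]: v for k, v in iso_msg.items() if k in SPEC_MAP}

msg = {"DE002": "4111111111111111", "DE004": "000000010000", "DE049": "356"}
print(to_business_record(msg))
```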
Responsibilities -
- Build amazing web applications using ReactJS, HTML, CSS, JS and more
- Work with a cross-shore development team
- Influence and collaborate to create an amazing online experience
- Participate in agile sprints with cross-functional teams including planning, daily standups, backlog grooming sessions and reviews
- Analyze production defects, troubleshoot systems, identify root cause, and implement fixes
- Work with third party vendors to develop software and/or integrate their software into our products
- Perform other duties and/or special projects as assigned
Desired characteristics –
- Experience building responsive web applications using ReactJS, HTML, CSS, JavaScript, and TypeScript
- Experience working on an agile development team
- Experience with building React component libraries and dependency management tools like NPM
- Experience with continuous integration environments like Jenkins
- Experience building and deploying applications
- Experience with unit testing frameworks, UI test cases in a Test-driven development (TDD) environment
- Working knowledge of implementing Accessibility (ADA) and Analytics requirements
Qualifications/Requirements:
- Bachelor’s degree in computer science or a related field OR, in lieu of a degree, a High School Diploma/GED plus 2 years of professional web application development experience
Role: Interior Designer / Architect - male
Experience: 1+ years
Skills Required: Proficiency in AutoCAD 2D, SketchUp, and V-Ray
Location: Candidate must be based in Pune
Qualification: Degree in Interior Design or B.Arch
Tejomaya Designs is renowned for its brilliance and splendor in every architectural and interior design project. We are driven by a relentless passion for contemporary art and design, aiming to create luxurious, stylish, and grand visions. Our work reflects effulgence and finesse, inspiring stunning designs like never before.
Role Description
This is a full-time on-site role based in Pune for an Interior Designer. The Interior Designer will be responsible for creating and executing interior design concepts, including space planning, collaborating with architects, and preparing construction drawings. Day-to-day tasks will involve coordinating with clients, selecting FF&E (Furniture, Fixtures, and Equipment), and overseeing project execution to ensure design integrity and quality.
Qualifications
Expertise in Interior Design and Space Planning
Experience with Architecture and preparing Construction Drawings
Knowledge of FF&E (Furniture, Fixtures, and Equipment) selection
Strong project management and coordination skills
Excellent communication and client interaction skills
Ability to work on-site in Pune
Bachelor's degree in Interior Design, Architecture, or related field
We are seeking a highly skilled Senior Data Engineer with expertise in Databricks, Python, Scala, Azure Synapse, and Azure Data Factory to join our data engineering team. The team is responsible for ingesting data from multiple sources, making it accessible to internal stakeholders, and enabling seamless data exchange across internal and external systems.
You will play a key role in enhancing and scaling our Enterprise Data Platform (EDP) hosted on Azure and built using modern technologies such as Databricks, Synapse, Azure Data Factory (ADF), ADLS Gen2, Azure DevOps, and CI/CD pipelines.
Responsibilities
- Design, develop, optimize, and maintain scalable data architectures and pipelines aligned with ETL principles and business goals.
- Collaborate across teams to build simple, functional, and scalable data solutions.
- Troubleshoot and resolve complex data issues to support business insights and organizational objectives.
- Build and maintain data products to support company-wide usage.
- Advise, mentor, and coach data and analytics professionals on standards and best practices.
- Promote reusability, scalability, operational efficiency, and knowledge-sharing within the team.
- Develop comprehensive documentation for data engineering standards, processes, and capabilities.
- Participate in design and code reviews.
- Partner with business analysts and solution architects on enterprise-level technical architectures.
- Write high-quality, efficient, and maintainable code.
Technical Qualifications
- 5–8 years of progressive data engineering experience.
- Strong expertise in Databricks, Python, Scala, and Microsoft Azure services including Synapse & Azure Data Factory (ADF).
- Hands-on experience with data pipelines across multiple source & target systems (Databricks, Synapse, SQL Server, Data Lake, SQL/NoSQL sources, and file-based systems).
- Experience with design patterns, code refactoring, CI/CD, and building scalable data applications.
- Experience developing batch ETL pipelines; real-time streaming experience is a plus.
- Solid understanding of data warehousing, ETL, dimensional modeling, data governance, and handling both structured and unstructured data.
- Deep understanding of Synapse and SQL Server, including T-SQL and stored procedures.
- Proven experience working effectively with cross-functional teams in dynamic environments.
- Experience extracting, processing, and analyzing large / complex datasets.
- Strong background in root cause analysis for data and process issues.
- Advanced SQL proficiency and working knowledge of a variety of database technologies.
- Knowledge of Boomi is an added advantage.
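The dimensional modeling requirement above (Star/Snowflake) can be shown with a toy star schema: a central fact table joined to dimension tables for analysis. SQLite stands in for Synapse/SQL Server here, and all tables and values are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Minimal star schema: one fact table keyed to two dimensions.
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, day TEXT, month TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE fact_sales  (date_key INTEGER, product_key INTEGER, qty INTEGER, revenue REAL);

INSERT INTO dim_date VALUES (1,'2024-01-01','Jan'), (2,'2024-01-02','Jan');
INSERT INTO dim_product VALUES (10,'TV','Electronics'), (20,'Sofa','Furniture');
INSERT INTO fact_sales VALUES (1,10,2,900.0), (1,20,1,400.0), (2,10,1,450.0);
""")

# Typical BI query: revenue by category, joining the fact to a dimension.
rows = conn.execute("""
SELECT p.category, SUM(f.revenue)
FROM fact_sales f JOIN dim_product p ON f.product_key = p.product_key
GROUP BY p.category ORDER BY p.category
""").fetchall()
print(rows)  # [('Electronics', 1350.0), ('Furniture', 400.0)]
```

A snowflake variant would further normalize the dimensions (e.g. category into its own table); the fact table stays the same.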
Core Skills & Competencies
- Excellent analytical and problem-solving abilities.
- Strong communication and cross-team collaboration skills.
- Self-driven with the ability to make decisions independently.
- Innovative mindset and passion for building quality data solutions.
- Ability to understand operational systems, identify gaps, and propose improvements.
- Experience with large-scale data ingestion and engineering.
- Knowledge of CI/CD pipelines (preferred).
- Understanding of Python and parallel processing frameworks (MapReduce, Spark, Scala).
- Familiarity with Agile development methodologies.
Education
- Bachelor’s degree in Computer Science, Information Technology, MIS, or an equivalent field.
As a Data Engineer, you will be an integral part of our team, working on data pipelines, data warehousing, and data integration for various analytics and AI use cases. You will collaborate closely with Delivery Managers, ML Engineers and other stakeholders to ensure seamless data flow and accessibility. Your expertise will be crucial in enabling data-driven decision-making for our clients. To thrive in this role, you need to be a quick learner, get excited about innovation and be on the constant lookout to master new technologies as they come up in the Data, AI & Cloud teams.
Key Responsibilities
- Design, develop, and maintain scalable data pipelines and ETL processes to support downstream analytics and AI applications.
- Collaborate with ML Engineers to integrate data solutions into machine learning models and workflows.
- Work closely with clients to understand their data requirements and deliver tailored data solutions.
- Ensure data quality, integrity, and security across all projects.
- Optimize and manage data storage solutions in cloud environments (AWS, Azure, GCP).
- Utilize Databricks for data processing and analytics tasks, leveraging its capabilities to enhance data workflows.
- Monitor the performance of data pipelines, identify bottlenecks or failures, and implement improvements to enhance efficiency and reliability.
- Implement best practices for data engineering, including documentation, testing, and version control.
- Troubleshoot and resolve data-related issues in a timely manner.
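An ETL pipeline, as described in the responsibilities above, follows an extract-transform-load shape with quality checks along the way. A minimal sketch using only the standard library (the source data and the quality rule are invented for illustration):

```python
import csv, io, sqlite3

# Extract: raw CSV as it might arrive from a source system (toy data;
# note the missing amount on row 2).
raw = io.StringIO("id,amount\n1,100\n2,\n3,250\n")

# Transform: drop rows failing a simple quality rule and track rejects.
clean, rejected = [], []
for row in csv.DictReader(raw):
    if row["amount"]:
        clean.append((int(row["id"]), float(row["amount"])))
    else:
        rejected.append(row["id"])

# Load: write the validated rows to the target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", clean)

print(conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone())  # (2, 350.0)
print(rejected)  # ['2']
```

The same extract/transform/load separation scales up to Databricks jobs, where each stage becomes a pipeline task with its own monitoring.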
Qualifications
- Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
- 3 to 5 years of experience as a Data Engineer or in a similar role.
- Strong proficiency in SQL, Python, and other relevant programming languages.
- Hands-on experience with Databricks and its ecosystem.
- Familiarity with major cloud environments (AWS, Azure, GCP) and their data services.
- Experience with data warehousing solutions like Snowflake, Redshift, or BigQuery.
- Comfortable working with a variety of SQL, NoSQL, and graph databases, such as PostgreSQL and MongoDB.
- Knowledge of data integration tools.
- Understanding of data modelling, data architecture, and database design.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
Highly Desirable Skills
- Experience with real-time data processing frameworks (e.g., Apache Kafka, Spark Streaming).
- Knowledge of data visualisation tools (e.g., Tableau, Power BI).
- Familiarity with machine learning concepts and frameworks.
- Experience working in a client-facing role.
Experience Required: 4–8 years in social media management (preferably F&B, hospitality, or lifestyle brands)
Qualification: Graduate
About the Company
A leading player in the F&B and lifestyle space, the company is known for delivering memorable dining and lifestyle experiences. With a strong focus on creativity, innovation, and customer engagement, the brand is expanding its digital presence and looking for a dynamic professional to lead its social media voice.
Role Overview
We are seeking a highly creative and strategic Social Media Executive to drive the digital presence of two lifestyle brands. This role goes beyond posting — it requires content innovation, trend spotting, influencer collaborations, and campaign execution that boost visibility, engagement, and conversions.
Key Responsibilities
Social Media Strategy & Content
- Build and manage monthly content calendars.
- Develop engaging, trend-driven content (posts, reels, stories, videos, carousels).
- Ensure brand tone and aesthetics remain consistent across platforms.
Community Management
- Monitor and manage DMs, comments, and customer interactions.
- Grow and engage online communities meaningfully.
Campaigns, Influencers & Collaborations
- Plan and execute influencer tie-ups, contests, and giveaways.
- Drive engagement campaigns aligned with marketing initiatives.
- Coordinate with agencies, creators, and photographers.
Activations & Events (Digital Integration)
- Amplify on-ground activations, pop-ups, and events digitally.
- Create real-time, interactive content around brand experiences.
Analytics & Reporting
- Track KPIs (reach, engagement, growth, conversions).
- Share regular reports with insights and recommendations.
Budget & Vendor Management
- Manage spends for influencer campaigns, paid promotions, and content production.
- Coordinate with vendors and in-house teams for shoots, editing, and production.
Collaboration
- Work closely with the Head of Marketing and Brand Marketing Executive to align social content with overall brand strategy.
Requirements
- 4–8 years of proven experience handling social media for F&B, lifestyle, or consumer brands.
- Expertise in Instagram, Facebook, LinkedIn (knowledge of emerging platforms is a plus).
- Strong understanding of trends, analytics, and influencer ecosystems.
- Proficiency in Canva, Photoshop, and short-form video editing tools.
- Creative thinker with excellent writing, visual, and storytelling skills.
- Ability to manage two brands simultaneously with consistency.
If you're passionate about digital storytelling and want to shape the online voice of vibrant lifestyle brands, we'd love to hear from you.
Job Description:
We are looking for a skilled Backend Developer with 2–5 years of experience in software development, specializing in Python and/or Golang. If you have strong programming skills, enjoy solving problems, and want to work on secure and scalable systems, we'd love to hear from you!
Location - Pune, Baner.
Interview Rounds - In Office
Key Responsibilities:
Design, build, and maintain efficient, reusable, and reliable backend services using Python and/or Golang
Develop and maintain clean and scalable code following best practices
Apply Object-Oriented Programming (OOP) concepts in real-world development
Collaborate with front-end developers, QA, and other team members to deliver high-quality features
Debug, optimize, and improve existing systems and codebase
Participate in code reviews and team discussions
Work in an Agile/Scrum development environment
Required Skills:
Strong experience in Python or Golang (working knowledge of both is a plus)
Good understanding of OOP principles
Familiarity with RESTful APIs and back-end frameworks
Experience with databases (SQL or NoSQL)
Excellent problem-solving and debugging skills
Strong communication and teamwork abilities
Good to Have:
Prior experience in the security industry
Familiarity with cloud platforms like AWS, Azure, or GCP
Knowledge of Docker, Kubernetes, or CI/CD tools
Must have Strong SQL skills (queries, optimization, procedures, triggers)
Must have Advanced Excel skills
Should have 3+ years of relevant experience
Should have Reporting + dashboard creation experience
Should have Database development & maintenance experience
Must have Strong communication for client interactions
Should have Ability to work independently
Willingness to work from client locations
Job description
📍 Location: Pune
🏢 Mode: Hybrid
📄 Employment Type: Permanent
Overview:
Are you a JavaScript and TypeScript enthusiast passionate about building user-friendly and visually appealing web applications? If you have 5+ years of React experience (preferably in SaaS), we’d love to have you join Azodha, a digital healthcare startup!
As a React Software Development Engineer, you will play a key role in developing and maintaining our web-based healthcare platform. You'll be responsible for building and optimizing React-based user interfaces, managing our frontend monorepo, and collaborating with a dynamic team to drive continuous improvement. If you're self-motivated, detail-oriented, and thrive in a fast-paced startup environment, we encourage you to apply!
Responsibilities:
- Develop and maintain our web-based healthcare platform using React, JavaScript, and TypeScript.
- Build and optimize reusable components within a frontend monorepo.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Continuously discover, evaluate, and integrate new technologies to enhance development efficiency.
- Write clean, maintainable, and efficient code.
- Stay up-to-date with industry best practices and emerging technologies.
Qualifications:
- 5+ years of experience with React (SaaS experience preferred).
- Strong understanding of JavaScript and TypeScript.
- Experience with HTML, CSS, and modern web development frameworks.
- Experience working in a collaborative team environment.
- Strong problem-solving and communication skills.
- Bachelor’s degree in Computer Science or a related field (preferred).
- Experience with monorepos, Storybook, and Playwright is a plus.
Soft Skills:
- Bias for action and initiative.
- Strong design sense and attention to detail.
- Empathy for users and a product-focused mindset.
- Excellent communication and teamwork skills.
About Azodha:
Azodha (formerly Hlth Dev) is a digital healthcare startup focused on accelerating the development of digital healthcare and therapeutic applications. Join us in making a meaningful impact in the healthcare industry!
📌 Industry: Technology, Information & Internet
📌 Employment Type: Full-time
Role: Senior Backend Engineer (Node.js + TypeScript + Postgres)
Location: Pune
Type: Full-Time
Who We Are:
After a highly successful launch, Azodha is ready to take its next major step. We are seeking a passionate and experienced Senior Backend Engineer to build and enhance a disruptive healthcare product. This is a unique opportunity to get in on the ground floor of a fast-growing startup and play a pivotal role in shaping both the product and the team.
If you are an experienced backend engineer who thrives in an agile startup environment and has a strong technical background, we want to hear from you!
About The Role:
As a Senior Backend Engineer at Azodha, you’ll play a key role in architecting, designing solutions for, and driving development of our AI-led interoperable digital enablement platform. You will work closely with the founder/CEO to refine the product vision, drive product innovation and delivery, and grow with a strong technical team.
What You’ll Do:
* Technical Excellence: Design, develop, and scale backend services using Node.js and TypeScript, including REST and GraphQL APIs. Ensure systems are scalable, secure, and high-performing.
* Data Management and Integrity: Work with Prisma or TypeORM and relational databases like PostgreSQL and MySQL.
* Continuous Improvement: Stay updated with the latest trends in backend development, incorporating new technologies where appropriate. Drive innovation and efficiency within the team.
* Utilize ORMs such as Prisma or TypeORM to interact with the database and ensure data integrity.
* Follow Agile sprint methodology for development.
* Conduct code reviews to maintain code quality and adherence to best practices.
* Optimize API performance for optimal user experiences.
* Participate in the entire development lifecycle, from initial planning and design through maintenance.
* Troubleshoot and debug issues to ensure system stability.
* Collaborate with QA teams to ensure high-quality releases.
* Mentor and provide guidance to junior developers, offering technical expertise and constructive feedback.
Requirements
* Bachelor's degree in Computer Science, Software Engineering, or a related field.
* 5+ years of hands-on experience in backend development using Node.js and TypeScript.
* Experience working with PostgreSQL or MySQL.
* Proficiency in TypeScript and its application in Node.js.
* Experience with ORM such as Prisma or TypeORM.
* Familiarity with Agile development methodologies.
* Strong analytical and problem-solving skills.
* Ability to work independently and in a team-oriented, fast-paced environment.
* Excellent written and oral communication skills.
* Self-motivated and proactive attitude.
Preferred:
* Experience with other backend technologies and languages.
* Familiarity with continuous integration and deployment processes.
* Contributions to open-source projects related to backend development.
Note: Please do not apply unless your primary database is PostgreSQL.
Join our team of talented engineers and be part of building cutting-edge backend systems that drive our applications. As a Senior Backend Engineer, you'll have the opportunity to shape the future of our backend infrastructure and contribute to the company's success. If you are passionate about backend development and meet the above requirements, we encourage you to apply and become a valued member of our team at Azodha.
The ideal candidate will play a key role in designing, implementing, and maintaining cloud infrastructure and CI/CD pipelines to support scalable, secure, and high-performance data and analytics solutions. This role requires strong expertise in Azure, Databricks, and cloud-native DevOps practices.
Key Responsibilities:
1. Cloud Infrastructure Design & Management
- Architect, deploy, and manage scalable and secure cloud infrastructure on Microsoft Azure.
- Implement best practices across resource groups, virtual networks, storage accounts, etc.
- Ensure cost optimization, high availability, and disaster recovery for business-critical systems.
2. Databricks Platform Management
- Set up, configure, and maintain Databricks workspaces for data engineering, ML, and analytics workloads.
- Automate cluster management, job scheduling, and performance monitoring.
- Integrate Databricks seamlessly with Azure data and analytics services.
3. CI/CD Pipeline Development
- Design and implement CI/CD pipelines for infrastructure, applications, and data workflows.
- Work with Azure DevOps / GitHub Actions (or similar) for automated testing and deployments.
- Drive continuous delivery, versioning, and monitoring best practices.
4. Monitoring & Incident Management
- Implement monitoring and alerting with Dynatrace, Azure Monitor, Log Analytics, and Databricks metrics.
- Diagnose and resolve issues to ensure minimal downtime and smooth operations.
5. Security & Compliance
- Enforce IAM, encryption, network security, and secure development practices.
- Ensure compliance with organizational and regulatory cloud standards.
6. Collaboration & Documentation
- Work closely with data engineers, software developers, architects, and business teams to align infrastructure with business goals.
- Maintain thorough documentation for infrastructure, processes, and configurations.
Required Qualifications
- Bachelor’s degree in Computer Science, Engineering, or a related field.
Must-Have Experience
- 6+ years in DevOps / Cloud Engineering roles.
- Proven expertise in:
- Microsoft Azure (Azure Data Lake, Databricks, ADF, Azure Functions, AKS, Azure AD)
- Databricks for data engineering / analytics workloads.
- Strong experience applying DevOps practices to cloud-based data and analytics platforms.
Technical Skills
- Infrastructure as Code (Terraform, ARM, Bicep).
- Scripting (Python / Bash).
- Containerization & orchestration (Docker, Kubernetes).
- CI/CD & version control (Git, Azure DevOps, GitHub Actions).
Soft Skills
- Strong analytical and problem-solving mindset.
- Excellent communication and collaboration abilities.
- Ability to operate in cross-functional and fast-paced environments.
Position: Sales Manager – Identity & Access Management (IGA/IDAM Solutions)
Mode - Permanent
Base Location: Mumbai/Pune
Experience: 8–10 years
Domain Focus: Identity & Access Management (IGA/IDAM Solutions), BFSI (Banking, Financial Services & Insurance), IT and Retail Clients
Desired Skills & Competencies
- Proven track record in selling Identity & Access Management (IAM/IGA/IDAM) or Identity Security solutions.
- Excellent communication, negotiation, and presentation skills with the ability to influence decision-makers.
- Robust network with CISOs, CIOs, CTOs, Risk & Compliance Heads, and IT Security Leaders.
- Ability to independently drive the complete sales cycle—from prospecting, opportunity identification, solution positioning, proposal building, to closure.
- Understanding of regulatory and compliance requirements in BFSI, IT, Identity Security and Retail domain as they relate to Identity & Access Management.
- Experience in coordinating with cross-functional teams (presales, solution architects, delivery) to ensure successful outcomes.
- Consultative selling mindset with strong business acumen and ability to map IAM solutions to client’s security and compliance challenges.
Detailed Job Description
We are looking for a dynamic and results-driven Sales Manager to drive growth for our flagship Identity Confluence (IGA/IDAM) solution. The role combines strategic sales, account management, and subject-matter expertise in Identity & Access Management.
Key Responsibilities:
- Drive new business development and consistently achieve or exceed sales targets for Identity Confluence.
- Develop and nurture long-term relationships with CXO-level stakeholders (CISO, CIO, CTO, CRO, Risk/Compliance Heads, IT Director).
- Partner closely with pre-sales and solutioning teams to deliver tailored demos, solution workshops, and compelling proposals.
- Build and maintain a healthy pipeline of opportunities through proactive prospecting, networking, and leveraging industry connects.
- Stay updated on IAM/IGA trends and competitor offerings to position Identity Confluence as the preferred choice.
- Provide market intelligence and client feedback to product and marketing teams to shape future solution roadmaps.
- Represent the organization in industry events, conferences, and partner forums to enhance visibility and generate leads.
What We Offer
- Opportunity to represent a premium enterprise-grade IGA solution – Identity Confluence.
- Competitive fixed compensation plus attractive performance-based incentives.
- Exposure to large enterprises and CXO-level leadership engagements.
- Career growth opportunities in a rapidly growing organization.
- Collaborative and entrepreneurial work culture with the chance to make a significant impact.
Hiring: Azure Data Engineer
⭐ Experience: 2+ Years
📍 Location: Pune, Bhopal, Jaipur, Gurgaon, Bangalore
⭐ Work Mode: Hybrid
⏱️ Notice Period: Immediate Joiners
Passport: Mandatory & Valid
(Only immediate joiners & candidates serving notice period)
Mandatory Skills:
Azure Synapse, Azure Databricks, Azure Data Factory (ADF), SQL, Delta Lake, ADLS, ETL/ELT, PySpark.
Responsibilities:
- Build and maintain data pipelines using ADF, Databricks, and Synapse.
- Develop ETL/ELT workflows and optimize SQL queries.
- Implement Delta Lake for scalable lakehouse architecture.
- Create Synapse data models and Spark/Databricks notebooks.
- Ensure data quality, performance, and security.
- Collaborate with cross-functional teams on data requirements.
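The ETL/ELT workflow responsibility above can be illustrated with a minimal transform step. This is a hedged, pure-Python sketch of the kind of logic (deduplication by latest timestamp, type casting) that would in practice run as PySpark in a Databricks notebook against Delta Lake; all record fields and names here are hypothetical.

```python
from datetime import datetime

def transform(rows):
    """Cast raw string fields to typed values and deduplicate by id,
    keeping the record with the latest timestamp. A stand-in for a
    transform stage in an ADF/Databricks pipeline."""
    latest = {}
    for row in rows:
        rec = {
            "id": int(row["id"]),
            "amount": float(row["amount"]),
            "ts": datetime.fromisoformat(row["ts"]),
        }
        cur = latest.get(rec["id"])
        if cur is None or rec["ts"] > cur["ts"]:
            latest[rec["id"]] = rec
    return sorted(latest.values(), key=lambda r: r["id"])

raw = [
    {"id": "1", "amount": "9.50", "ts": "2024-01-01T10:00:00"},
    {"id": "1", "amount": "11.00", "ts": "2024-01-02T10:00:00"},  # newer duplicate
    {"id": "2", "amount": "5.25", "ts": "2024-01-01T12:00:00"},
]
clean = transform(raw)
```

The same keep-latest-per-key pattern maps directly onto a Delta Lake `MERGE`/upsert in a real pipeline.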
Nice to Have:
Azure DevOps, Python, Streaming (Event Hub/Kafka), Power BI, Azure certifications (DP-203).
Leapfrog is on a mission to be a role model technology company. Since 2010, we have relentlessly worked on crafting better digital products with our team of superior engineers. We’re a full-stack company specializing in SaaS products and have served over 100 clients with our mastery of emerging technologies.
We’re thinkers and doers, creatives and coders, makers and builders— but most importantly, we are trusted partners with world-class engineers. Hundreds of companies in Boston, Seattle, Silicon Valley, and San Francisco choose us to gain speed, agility, quality, and stability, giving them an edge over their competitors.
We are seeking a highly skilled Salesforce Developer to enhance our customer engagement capabilities by upgrading our Legacy Chat to Enhanced Chat. The ideal candidate will have hands-on experience with Salesforce Service Cloud and Sales Cloud, coupled with a strong understanding of Omni-Channel and Live Agent functionalities.
This role requires proven expertise in Apex, Lightning Web Components (LWC), JavaScript, HTML, SOQL, and SOSL, with the ability to design and implement scalable, high-quality Salesforce solutions that drive customer success.
Essential Duties & Responsibilities
- Focus on delivering high-quality, functional solutions on the Salesforce.com platform using Apex, Lightning Web Components (LWC), SOAP, and REST APIs.
- Lead the migration from Legacy Chat to Enhanced Chat, ensuring a seamless transition for users and customers within Service Cloud and Sales Cloud.
- Design and implement Omni-Channel and Omni-Flow configurations to optimize customer service workflows and routing.
- Perform deployment, testing, and documentation of Salesforce features, enhancements, and integrations.
- Collaborate closely with product owners, engineering teams, and business stakeholders to define, clarify, and implement both functional and non-functional requirements for new and existing backlog items.
- Train and support end-users on implemented Salesforce features and planned solutions to ensure adoption and efficiency.
- Investigate, scope, and plan the implementation of assigned epics and backlog items, leveraging deep Salesforce platform expertise to model, document, and justify scalable, maintainable solutions.
Desired Outcomes
- Lead the migration from Legacy Chat to Enhanced Chat within Service Cloud and Sales Cloud, ensuring a seamless, scalable, and user-friendly transition for both customers and internal teams.
- Design, build, and deploy Enhanced Chat configurations, including Omni-Channel and Omni-Flow setups, to optimize response times, routing efficiency, and overall customer engagement.
- Execute deployment, testing, and documentation of Salesforce features, enhancements, and integrations, maintaining high standards of quality, performance, and compliance with best practices.
About you
- Minimum 5 years of hands-on experience with coding on the Salesforce Platform using Apex, Visualforce, Lightning / Aura Components, JavaScript, HTML, REST/SOAP APIs, etc.
- Minimum 2 years of hands-on experience creating Flows
- Minimum 2 years of experience with Omnichannel and Live Agent Chat
- Minimum 2 years of experience with Sales Cloud and Service Cloud
Required Education / Certificates / Experience
- Bachelor of Science or equivalent preferably in Computer Science / Computer Engineering / Electrical Engineering
- Salesforce Platform Developer I certification
Strong Lead – User Research & Analyst profile (behavioural/user/product/ux analytics)
Mandatory (Experience 1): Must have 10+ years of experience in Behavioral Data Analytics, User Research, or Product Insights, driving data-informed decision-making for B2C digital products (web and app).
Mandatory (Experience 2): Must have 6 months+ experience in analyzing user journeys, clickstream, and behavioral data using tools such as Google Analytics, Mixpanel, CleverTap, Firebase, or Amplitude.
Mandatory (Experience 3): Experience in leading cross-functional user research and analytics initiatives in collaboration with Product, Design, Engineering, and Business teams to translate behavioral insights into actionable strategies.
Mandatory (Skills 1): Strong expertise in A/B testing and experimentation, including hypothesis design, execution, statistical validation, and impact interpretation.
Mandatory (Skills 2): Ability to identify behavioral patterns, funnel drop-offs, engagement trends, and user journey anomalies using large datasets and mixed-method analysis.
Mandatory (Skills 3): Hands-on proficiency in SQL, Excel, and data visualization/storytelling tools such as Tableau, Power BI, or Looker for executive reporting and dashboard creation.
Mandatory (Skills 4): Deep understanding of UX principles, customer journey mapping, and product experience design, with experience integrating qualitative and quantitative insights.
Mandatory (Company): B2C product organizations (fintech, e-commerce, edtech, or consumer platforms) with large-scale user datasets and analytics maturity.
Mandatory (Note): We are not looking for pure data analysts; we want strategic behavioral insight leaders or research-driven analytics professionals focused on user behavior and product decision-making.
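The A/B testing and statistical-validation skill listed above typically comes down to testing whether a variant's conversion rate differs significantly from control. As a hedged sketch, here is the textbook two-proportion z-test in plain Python; the traffic and conversion numbers are purely illustrative.

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates.
    Returns (z statistic, p-value) using the pooled-proportion
    standard error."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Illustrative numbers: variant B converts at 12% vs control's 10%.
z, p = two_proportion_z(conv_a=500, n_a=5000, conv_b=600, n_b=5000)
significant = p < 0.05
```

In practice tools like Mixpanel or Amplitude report this for you, but being able to validate the math by hand is what the "statistical validation" requirement is getting at.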
Role - RSA Archer Technical Specialist
Location preferred - Bangalore + key metros
Exp Band - 10+
JD
- Experience in application development using the Archer platform
- Proficiency in Archer configuration, including custom fields, rules, and workflows
- Strong understanding of GRC concepts and the business context of Archer solutions
- Experience with web technologies including HTML, JavaScript, and CSS
- Familiarity with integration techniques and APIs
- Excellent problem-solving and analytical skills
- Able to work independently and collaboratively in a fast-paced environment
- Strong communication skills to interact with various stakeholders effectively
Overall 4 Years
Module pool programming
BDC (Batch Data Communication)
Interactive reports/forms
OOP (Object-Oriented Programming) concepts
In-depth knowledge of Function Modules (FM), RFC, BAPI, and Web Services.
Strong experience with Smart Forms and Adobe Forms.
Extensive experience with BADI and User Exits for customizing SAP applications.
Proficiency in ABAP Web Dynpro programming for developing web-based applications.
Hands-on experience in creating and maintaining ABAP Dictionary objects (tables, views, domains, data elements, etc.).
Ability to effectively debug programs, identify bug fixes, and optimize performance bottlenecks.
Ability to understand functional/technical specifications and translate them into high-quality ABAP code.
Experience in preparing technical documentation as per established templates and standards.
Proven experience working on medium and large-scale SAP projects.
Desirable Skills:
Exposure to SAP HANA and ABAP on HANA for high-performance computing.
Familiarity with SAP Fiori and UI5 for modern web-based applications.
Understanding of SAP integration technologies, including IDocs, BAPI, and RFC.
Knowledge of Agile methodologies or Scrum project management frameworks.
Only candidates currently in Pune or open to relocating to Pune, please apply:
Job Description:
We are looking for a proactive, detail-oriented Bid Support Executive to join our business development team. This role involves identifying tender opportunities, coordinating across departments to prepare bid documents, and ensuring timely submissions on various e-procurement platforms. The ideal candidate will be highly organized, capable of multitasking, and possess strong communication skills.
Key Responsibilities:
- You will track and identify relevant tenders (Govt./PSU/Private portals, GeM).
- You will assist in preparing and formatting technical and financial bid documents.
- You will coordinate with internal departments (Tech, Finance, HR, Legal) to gather required inputs.
- You will assist in submitting bids on e-procurement portals (GeM, CPPP, state portals) within timelines.
- You will maintain and organize the standard document repository, declarations, and past submissions.
- You will handle post-submission queries and documentation requests.
- You should be good at handling administrative tasks and coordinating across departments for timely bid compilation and approvals.
Candidate Profile:
Required Qualifications:
- 2–3 years of experience in bid/tender support (preferably in IT, e-Gov, or HealthTech sectors).
- Proficiency in MS Word, Excel, and PowerPoint.
- Familiarity with e-procurement portals and tender procedures.
- Strong communication and coordination skills.
- Good at multitasking and managing administrative follow-ups.
- High attention to detail and time management.
Desired Qualifications:
- Experience in preparing and submitting tenders or proposals
- Understanding of government bidding and RFP process
- Basic knowledge of healthcare or IT solutions (optional)
- Familiar with tender portals like GeM or CPPP
- Ability to prepare standard documents
- Comfortable using tools like Google Drive, Excel, or task trackers.
Experience:
2 to 3 Years
Educational Background:
Graduate in any discipline (Business/IT preferred)
Position: Sales Development Representative (International Voice Process)
Job Responsibilities
● Making multiple outbound calls to assigned B2B prospects. Develop sales opportunities by researching the prospective company, using influencing and relationship-building skills, and providing information on the client's product/value proposition.
● Ability to understand the key objections from prospects, clarify their concerns & use product knowledge & vendor-led training to alleviate these concerns & move the sales cycle forward. Persistently follow up with the prospect in a clear & timely manner to ensure positive outcomes.
● Understand customer campaigns, and their products/services and appropriately communicate customer brand identity to prospects. Provide detailed and concise feedback to the Voice Operations leads on the outcomes (conversions/rejects / not interested etc.).
● Undertake pre-sales outreach processes such as AG, HQL, SQL, BANT, marketing, and sales lead qualification.
Requirements:
● Minimum 2 years' experience in B2B sales, ideally selling to technology stakeholders and senior stakeholders, including C-suite, within enterprise and SMB organizations, with a solid track record of lead conversions via outbound calling and emails.
● Been part of marketing and sales lead generation campaign teams focusing on SQL, MQL, BANT, AG, etc.
● Excellent verbal communication and convincing skills; should be able to think on their feet and provide effective rebuttals/responses to prospects on the calls.
● Strong track record of meeting their targets for SQL / MQL / BANT/ AG campaigns.
● Should be self-motivated, energetic, able to work in a dynamic environment focusing on outcomes, and demonstrate a high level of resilience.
● A go-getter and collaborator who is keen to learn and is highly receptive to feedback.
Role Overview
We are looking for a Linux Engineer with strong expertise in Red Hat Enterprise Linux (RHEL) and Identity Management (IdM) to manage centralized authentication, access control, directory services, and secure identity operations across Linux systems.
Key Responsibilities
- Install, configure, and maintain Red Hat Identity Management (IdM) / FreeIPA.
- Manage identity lifecycle: users, groups, roles, sudo rules, HBAC policies.
- Configure and troubleshoot SSSD, LDAP, Kerberos, and PAM authentication.
- Integrate Linux servers with Active Directory using trusts or SSSD.
- Manage DNS, certificates (Dogtag CA), and multi-master replication within IdM.
- Join and enroll Linux hosts into the IdM domain.
- Automate identity operations using Ansible and shell scripting.
- Troubleshoot authentication/login issues across servers and applications.
- Work closely with security and platform teams to enforce IAM/IAP policies.
Required Skills
- Strong RHEL/Linux administration skills.
- Hands-on experience with RedHat IdM / FreeIPA.
- Deep understanding of LDAP, Kerberos, SSSD, DNS, and PKI.
- Experience integrating Linux with AD.
- Shell scripting; Ansible automation experience.
Good to Have
- Exposure to Okta, Azure AD, Keycloak, or other IAM platforms.
- Cloud (AWS/Azure/GCP) experience.
- Security hardening knowledge.
Note: One technical round is mandatory and must be taken from either the Baner (Pune) or Bellandur (Bangalore) office.
Job Description: Python Engineer
Role Summary
We are looking for a talented Python Engineer to design, develop, and maintain high-quality backend applications and automation solutions. The ideal candidate should have strong programming skills, familiarity with modern development practices, and the ability to work in a fast-paced, collaborative environment.
Key Responsibilities:
Python Development & Automation
- Design, develop, and maintain Python scripts, tools, and automation frameworks.
- Build automation for operational tasks such as deployment, monitoring, system checks, and maintenance.
- Write clean, modular, and well-documented Python code following best practices.
- Develop APIs, CLI tools, or microservices when required.
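The "automation frameworks" and "clean, modular code" bullets above can be sketched with a small task-registry pattern, a common backbone for operational automation tools. This is a hedged illustration, not a prescribed framework; the task names and checks are placeholders.

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("automation")

TASKS = {}

def task(name):
    """Decorator that registers an automation task under a name,
    so new checks can be added without touching the runner."""
    def wrap(fn):
        TASKS[name] = fn
        return fn
    return wrap

@task("disk_check")
def disk_check():
    # Placeholder for a real check, e.g. shutil.disk_usage("/").
    return "ok"

@task("service_check")
def service_check():
    # Placeholder for e.g. probing a systemd unit or health endpoint.
    return "ok"

def run_all():
    """Run every registered task, logging failures instead of aborting,
    so one broken check does not stop the whole maintenance run."""
    results = {}
    for name, fn in TASKS.items():
        try:
            results[name] = fn()
            log.info("%s: %s", name, results[name])
        except Exception as exc:
            results[name] = f"failed: {exc}"
            log.error("%s failed: %s", name, exc)
    return results

results = run_all()
```

The same registry can back a CLI (via argparse) or a Jenkins stage that runs a named subset of tasks.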
Linux Systems Engineering
- Manage, configure, and troubleshoot Linux environments (RHEL, CentOS, Ubuntu).
- Perform system performance tuning, log analysis, and root-cause diagnostics.
- Work with system services, processes, networking, file systems, and security controls.
- Implement shell scripting (bash) alongside Python for system-level automation.
CI/CD & Infrastructure Support
- Support integration of Python automation into CI/CD pipelines (Jenkins).
- Participate in build and release processes for infrastructure components.
- Ensure automation aligns with established infrastructure standards and governance.
- Use Bash scripting together with Python to improve automation efficiency.
Cloud & DevOps Collaboration (if applicable)
- Collaborate with Cloud/DevOps engineers on automation for AWS or other cloud platforms.
- Integrate Python tools with configuration management tools such as Chef or Ansible, or with Terraform modules.
- Contribute to containerization efforts (Docker, Kubernetes) leveraging Python automation.
🚀 Hiring: MEAN Stack Developer
⭐ Experience: 5+ Years
📍 Location: Pune, Bhopal, Jaipur, Gurgaon, Bangalore
⭐ Work Mode: Hybrid
⏱️ Notice Period: Immediate Joiners
Passport: Mandatory & Valid
(Only immediate joiners & candidates serving notice period)
Key Responsibilities
- Design and develop high-performance front-end features using Angular and JavaScript/TypeScript.
- Build robust RESTful APIs and server-side logic using Node.js and Express.js.
- Work with databases like MongoDB, caching systems, and cloud deployment environments.
- Optimize applications for maximum speed, scalability, and reliability.
- Ensure responsive, pixel-perfect design and participate in code reviews.
Required Skills
- Must-Have: Node.js, Angular, JavaScript, HTML, CSS
- Good to Have: TypeScript, MongoDB, Express.js, Git, CI/CD exposure
We are currently looking to hire a Sales Manager (FMCG Division) to join our team.
Position: Sales Manager (FMCG Division)
Location: Pune
Experience Required: 5–7 years of FMCG (Food) sales experience — experience in Modern Trade / E-commerce / Quick Commerce / HoReCa is essential.
Qualification: Bachelor’s Degree required; MBA / Master’s in Marketing preferred.
Job Description :
We’re seeking a driven and detail-oriented Sales Manager to lead FMCG expansion across Modern Trade, Quick Commerce, HoReCa, and E-commerce (Amazon) channels.
The ideal candidate will combine strong sales acumen with operational discipline, brand understanding, and team leadership to drive growth and profitability across verticals.
Key Responsibilities :
Sales & Business Development
Develop and execute sales strategies across Modern Trade, Quick Commerce, HoReCa, and E-commerce (Amazon) channels.
Drive Quick Commerce tie-ups (Blinkit, Zepto, Swiggy Instamart, etc.), ensuring consistent fill rates, visibility, and on-time delivery.
Expand and manage Modern Trade business — new tie-ups, branding opportunities, and store-level execution.
Acquire and manage HoReCa accounts, ensuring strong partnerships and growth.
Explore and onboard new retail store tie-ups and distributors to strengthen reach and market penetration.
Maintain vertical-wise P&L accountability, ensuring profitability across all sales channels.
Ensure payment cycles are monitored and reconciled as per company policy.
E-Commerce & Brand Coordination
Oversee Amazon listings and catalog management, ensuring all products are correctly priced, described, and optimized.
Maintain online hygiene — ensuring accurate inventory, images, descriptions, ratings, and timely response to customer queries.
Coordinate with the marketing team for updated creatives, offers, product launches, and campaign rollouts across digital platforms.
Identify and execute branding opportunities within partner stores and online marketplaces.
Team & Operations:
Conduct regular store and market visits to ensure proper visibility, placement, and compliance.
Train, mentor, and develop the sales and promoter team to enhance performance and product knowledge.
Maintain strong vendor relationships and follow-up for timely supply and collections.
Collaborate closely with operations for smooth delivery logistics and minimal wastage (RTC losses).
Provide market insights, competitor analysis, and category feedback to guide business strategy.
Qualifications & Requirements
Bachelor’s Degree required; MBA / Master’s in Marketing preferred.
5–7 years of FMCG (Food) sales experience — experience in Modern Trade / E-commerce / Quick Commerce / HoReCa is essential.
Strong leadership and team-building capabilities.
Excellent communication, negotiation, and relationship management skills.
Hands-on experience in Amazon / online retail operations preferred.
Analytical mindset with sound knowledge of market trends, pricing, and brand visibility strategies.
Role: Mid–Senior Cloud Infrastructure Engineer
Type: Individual Contributor
Location: Baner, Pune
Experience: 4–9 years
About the Role
We are establishing a Mid–Senior Cloud Infrastructure Engineer CoE with a focus on cloud-native automation. As a Cloud Infrastructure Engineer, you will design, automate, and maintain AWS infrastructure using Terraform following IaC and platform engineering best practices.
Key Responsibilities
- Build, automate, and maintain AWS infrastructure using Terraform (remote state, modules, pipelines).
- Design secure, scalable AWS environments (VPC, EC2, IAM, ALB/NLB, S3, RDS, Lambda, ECS/EKS preferred).
- Create reusable Terraform modules for platform-wide standardization.
- Automate provisioning workflows for Linux and Windows environments.
- Integrate IaC into CI/CD (Jenkins/GitHub Actions/GitLab).
- Manage environment lifecycle: dev → test → stage → prod.
- Implement cost optimization, tagging strategies, and operational guardrails.
- Maintain infrastructure documentation and reusable patterns.
- Troubleshoot cloud deployments, networking, IAM policies, and automation issues.
Required Skills
- Strong hands-on experience with AWS services.
- Solid expertise with Terraform (workspaces, state mgmt, modules, CI/CD integration).
- Understanding of multi-tier architectures, networking, security groups, IAM.
- Familiarity with Linux/Windows system administration.
- Knowledge of scripting (Python/Bash/PowerShell).
Good to Have
- AWS Certified (SA / SysOps / DevOps — not mandatory).
- Experience with container orchestration (ECS/EKS).
- Experience with monitoring tools (CloudWatch, Grafana, Prometheus).
Soft Skills
- Strong analytical and troubleshooting capability.
- Ability to collaborate closely with DevOps/Platform teams.
NOTE: The 2nd technical round has to be taken face-to-face (F2F) at the Baner office in Pune.
We’re Hiring: Social Media Executive – Pune
💼 Exp: 4–6 yrs (F&B/Hospitality/Lifestyle)
🗓️ 6 Days Working
✨ Role Overview
• 📅 Build monthly content calendars
• 🎥 Create reels, posts, stories & trend-driven content
• 🎨 Maintain brand tone, design & aesthetics
• 💬 Handle DMs, comments & community engagement
• 🤝 Manage influencers, collabs, contests & campaigns
• 🎪 Amplify events/pop-ups with real-time content
• 📊 Track KPIs, insights & prepare reports
• 💰 Manage budgets, vendors & production
• 🤝 Work with Marketing team for aligned brand strategy
🧩 Requirements
• 🔥 4–6 yrs in social media for F&B/Lifestyle brands
• 📱 Strong in IG, FB, LinkedIn (new platforms = bonus)
• 📈 Trend spotting + analytics expertise
• 🖌️ Skilled in Canva/PS/Reels editing
• ✍️ Creative in writing, visuals & storytelling
• 🧵 Ability to manage 2 brands smoothly
Key Responsibilities:
Design, develop, and implement SAP CPI (Cloud Platform Integration) and SAP PI/PO interfaces.
Work on end-to-end integration scenarios across SAP and non-SAP systems.
Develop IFlows, mappings (Graphical, XSLT, Java), and configure adapters.
Perform integration testing, troubleshooting, and performance optimization.
Collaborate with functional and technical teams to understand business requirements.
Monitor, support, and enhance existing integration solutions.
Ensure documentation of configurations, technical designs, and integration flows.
Key Skills & Expertise:
Strong hands-on experience in SAP CPI / HCI.
Solid experience with SAP PI/PO (Single Stack).
Expertise in REST, SOAP, OData, SFTP, IDOC, RFC integrations.
Knowledge of XML, JSON, Groovy scripting, and API management.
Experience in handling end-to-end implementation & support projects.
Good understanding of integration architecture and cloud connectivity.
Job Summary:
Deqode is looking for a highly motivated and experienced Python + AWS Developer to join our growing technology team. This role demands hands-on experience in backend development, cloud infrastructure (AWS), containerization, automation, and client communication. The ideal candidate should be a self-starter with a strong technical foundation and a passion for delivering high-quality, scalable solutions in a client-facing environment.
Key Responsibilities:
- Design, develop, and deploy backend services and APIs using Python.
- Build and maintain scalable infrastructure on AWS (EC2, S3, Lambda, RDS, etc.).
- Automate deployments and infrastructure with Terraform and Jenkins/GitHub Actions.
- Implement containerized environments using Docker and manage orchestration via Kubernetes.
- Write automation and scripting solutions in Bash/Shell to streamline operations.
- Work with relational databases like MySQL and SQL, including query optimization.
- Collaborate directly with clients to understand requirements and provide technical solutions.
- Ensure system reliability, performance, and scalability across environments.
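The query-optimization responsibility above usually starts with reading the engine's query plan and adding the right index. As a hedged, self-contained sketch, here is the idea using the standard-library `sqlite3` module standing in for MySQL (where the equivalent tool is `EXPLAIN`); the table and index names are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INT, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, float(i)) for i in range(1000)],
)

# Without an index, filtering on customer_id is a full table scan.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42"
).fetchall()

# Adding an index lets the engine seek directly to matching rows.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42"
).fetchall()

# The plan detail text is the last column of each plan row.
scan_before = any("SCAN" in row[-1] for row in plan_before)
uses_index = any("idx_orders_customer" in row[-1] for row in plan_after)
```

The same workflow (inspect plan, index the filtered column, re-inspect) carries over to MySQL and RDS-hosted databases.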
Required Skills:
- 3.5+ years of hands-on experience in Python development.
- Strong expertise in AWS services such as EC2, Lambda, S3, RDS, IAM, CloudWatch.
- Good understanding of Terraform or other Infrastructure as Code tools.
- Proficient with Docker and container orchestration using Kubernetes.
- Experience with CI/CD tools like Jenkins or GitHub Actions.
- Strong command of SQL/MySQL and scripting with Bash/Shell.
- Experience working with external clients or in client-facing roles.
Preferred Qualifications:
- AWS Certification (e.g., AWS Certified Developer or DevOps Engineer).
- Familiarity with Agile/Scrum methodologies.
- Strong analytical and problem-solving skills.
- Excellent communication and stakeholder management abilities.
Job Description:
We are looking for a skilled Backend Developer with 2–5 years of experience in software development, specializing in Python and/or Golang. If you have strong programming skills, enjoy solving problems, and want to work on secure and scalable systems, we'd love to hear from you!
Location - Pune, Baner.
Interview Rounds - In Office
Key Responsibilities:
Design, build, and maintain efficient, reusable, and reliable backend services using Python and/or Golang
Develop and maintain clean and scalable code following best practices
Apply Object-Oriented Programming (OOP) concepts in real-world development
Collaborate with front-end developers, QA, and other team members to deliver high-quality features
Debug, optimize, and improve existing systems and codebase
Participate in code reviews and team discussions
Work in an Agile/Scrum development environment
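The "apply OOP concepts in real-world development" responsibility above is most often seen as programming against abstractions rather than concrete stores. As a hedged sketch (class and method names are hypothetical), here is the pattern in Python:

```python
from abc import ABC, abstractmethod
from typing import Dict, Optional

class Repository(ABC):
    """Abstraction over storage so services stay testable and reusable."""

    @abstractmethod
    def get(self, key: str) -> Optional[str]: ...

    @abstractmethod
    def put(self, key: str, value: str) -> None: ...

class InMemoryRepository(Repository):
    """Concrete implementation; a SQL- or NoSQL-backed repository
    would implement the same interface."""

    def __init__(self) -> None:
        self._data: Dict[str, str] = {}

    def get(self, key: str) -> Optional[str]:
        return self._data.get(key)

    def put(self, key: str, value: str) -> None:
        self._data[key] = value

class UserService:
    """The service depends on the abstraction, not the concrete store,
    so the backend can be swapped without touching business logic."""

    def __init__(self, repo: Repository) -> None:
        self.repo = repo

    def register(self, username: str) -> str:
        if self.repo.get(username) is not None:
            raise ValueError("user exists")
        self.repo.put(username, "active")
        return username

svc = UserService(InMemoryRepository())
created = svc.register("alice")
```

In Golang the same design would use an interface type instead of an abstract base class.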
Required Skills:
Strong experience in Python or Golang (working knowledge of both is a plus)
Good understanding of OOP principles
Familiarity with RESTful APIs and back-end frameworks
Experience with databases (SQL or NoSQL)
Excellent problem-solving and debugging skills
Strong communication and teamwork abilities
Good to Have:
Prior experience in the security industry
Familiarity with cloud platforms like AWS, Azure, or GCP
Knowledge of Docker, Kubernetes, or CI/CD tools
A Desktop Support Engineer is responsible for providing technical support and assistance to end-users in an organization. This role involves troubleshooting hardware and software issues, ensuring that desktop systems are functioning efficiently, and maintaining a high level of customer satisfaction.
Key Responsibilities:
- Respond to user inquiries and provide technical support via phone, email, or in-person.
- Diagnose and resolve hardware and software problems, including operating systems, applications, and network connectivity issues.
- Install, configure, and upgrade desktop hardware and software, ensuring compliance with company standards.
- Maintain inventory of desktop equipment and software licenses, ensuring proper documentation and tracking.
- Collaborate with IT teams to implement new technologies and improve existing systems.
- Provide training and support to users on new software applications and tools.
- Assist in the setup and deployment of new workstations and peripherals.
- Monitor and maintain system performance, applying updates and patches as necessary.
- Document technical procedures and solutions for future reference.
Qualifications:
- Bachelor’s degree in computer science, information technology, or a related field, or equivalent experience.
- Proven experience in a desktop support role or similar technical support position.
- Strong knowledge of Windows and macOS operating systems, as well as common software applications.
- Familiarity with networking concepts and troubleshooting techniques.
- Excellent problem-solving skills and the ability to work under pressure.
- Strong communication skills, both verbal and written, with a customer-focused attitude.
- Relevant certifications (e.g., CompTIA A+, Microsoft Certified Desktop Support Technician) are a plus.
This role is essential for maintaining the productivity of employees by ensuring that their desktop environments are operational and efficient. A successful Desktop Support Engineer will be proactive, detail-oriented, and able to work independently as well as part of a team.