
50+ Startup Jobs in Pune | Startup Job openings in Pune

Apply to 50+ Startup Jobs in Pune on CutShort.io. Explore the latest Startup Job opportunities across top companies like Google, Amazon & Adobe.

Hunarstreet Technologies pvt ltd

Agency job
Pune
7 - 15 yrs
₹6L - ₹12L / yr
Sales
HoReCa
Food

We’re seeking a driven and detail-oriented Sales Manager to lead Acasa’s FMCG expansion across Modern Trade, Quick Commerce, HoReCa, and E-commerce (Amazon) channels.


The ideal candidate will combine strong sales acumen with operational discipline, brand understanding, and team leadership to drive growth and profitability across verticals.


Key Responsibilities

Sales & Business Development

 Develop and execute sales strategies across Modern Trade, Quick Commerce, HoReCa, and E-commerce (Amazon) channels.

 Drive Quick Commerce tie-ups (Blinkit, Zepto, Swiggy Instamart, etc.)

ensuring consistent fill rates, visibility, and on-time delivery.

 Expand and manage Modern Trade business — new tie-ups, branding opportunities, and store-level execution.

 Acquire and manage HoReCa accounts, ensuring strong partnerships and growth.

 Explore and onboard new retail store tie-ups and distributors to strengthen reach and market penetration.

 Maintain vertical-wise P&L accountability, ensuring profitability across all sales channels.

 Ensure payment cycles are monitored and reconciled as per company policy.

E-Commerce & Brand Coordination

 Oversee Amazon listings and catalog management, ensuring all products are correctly priced, described, and optimized.

 Maintain online hygiene — ensuring accurate inventory, images, descriptions, ratings, and timely response to customer queries.

 Coordinate with the marketing team for updated creatives, offers, product launches, and campaign rollouts across digital platforms.


 Identify and execute branding opportunities within partner stores and online marketplaces.

Team & Operations

 Conduct regular store and market visits to ensure proper visibility, placement, and compliance.

 Train, mentor, and develop the sales and promoter team to enhance performance and product knowledge.

 Maintain strong vendor relationships and follow-up for timely supply and collections.

 Collaborate closely with operations for smooth delivery logistics and minimal wastage (RTC losses).

 Provide market insights, competitor analysis, and category feedback to guide business strategy.


Qualifications & Requirements

 Bachelor’s Degree required; MBA / Master’s in Marketing preferred.


 5–7 years of FMCG (Food) sales experience — experience in Modern Trade / E-commerce / Quick Commerce / HoReCa is essential.


 Strong leadership and team-building capabilities.

 Excellent communication, negotiation, and relationship management skills.

 Hands-on experience in Amazon / online retail operations preferred.

 Analytical mindset with sound knowledge of market trends, pricing, and brand visibility strategies.

Virtana
Posted by Krutika Devadiga
Pune
3 - 6 yrs
Best in industry
Kubernetes
Java
Docker
Test Automation (QA)
Manual testing

Position Overview:

As a Senior QA Engineer, you will play a critical role in driving quality across our product offerings. You will work closely with developers and product/support teams to ensure that our storage and networking monitoring solutions are thoroughly tested and meet enterprise-level reliability. A strong background in automation testing using Python and scripting is essential, along with proven debugging experience in enterprise products utilizing AWS, Cloud, and Kubernetes technologies. You will act as a key advocate for quality across the organization, interacting with diverse teams and stakeholders to push the boundaries of product excellence.


Work Location- Pune


Job Type- Hybrid


Key Responsibilities:

  • QA and Automation Testing: Develop exhaustive test plans and automated test cases using Python and scripting languages to validate end-to-end, real-world scenarios.
  • Enterprise Product Testing: Test enterprise-grade products deployed in AWS, Cloud, and Kubernetes environments, ensuring that they perform optimally in large-scale, real-world scenarios.
  • Debugging and Issue Resolution: Work closely with development teams to identify, debug, and resolve issues in enterprise-level products, ensuring high-quality and reliable product releases.
  • Test Automation Frameworks: Develop and maintain test automation frameworks to streamline testing processes, reduce manual testing efforts, and increase test coverage.
  • Customer Interaction: Be open to interacting with cross-geo customers to understand their quality requirements, test against real-world use cases, and ensure their satisfaction with product performance.
  • Voice of Quality: Act as an advocate for quality within the organization, pushing for excellence in product development and championing improvements in testing practices and processes.
  • Documentation: Create and maintain detailed documentation of testing processes, test cases, and issue resolutions, enabling knowledge sharing and consistent quality assurance practices.


Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Software Engineering, or a related field.
  • 3+ years of hands-on experience in QA and automation testing, with a strong focus on Python and scripting.
  • Proven experience in testing and debugging enterprise products deployed in AWS, Cloud, and Kubernetes environments.
  • Solid understanding of storage and networking domains, with practical exposure to monitoring use-cases.
  • Strong experience with automation testing frameworks, including the development and execution of automated test cases.
  • Excellent debugging, problem-solving, and analytical skills.
  • Strong communication skills, with the ability to collaborate with diverse teams across geographies and time zones.
  • Experience in working in agile development environments, with a focus on continuous integration and delivery.
  • Passion for quality and a relentless drive to push the boundaries of what can be achieved in product excellence.


Company Overview:

Virtana delivers the industry’s only unified platform for Hybrid Cloud Performance, Capacity and Cost Management. Our platform provides unparalleled, real-time visibility into the performance, utilization, and cost of infrastructure across the hybrid cloud – empowering customers to manage their mission-critical applications across physical, virtual, and cloud computing environments. Our SaaS platform allows organizations to easily manage and optimize their spend in the public cloud, assure resources are performing properly through real-time monitoring, and provide the unique ability to plan migrations across the hybrid cloud.


As we continue to expand our portfolio, we are seeking a highly skilled and hands-on Senior QA Engineer with strong Automation focus to contribute to the futuristic development of our Platform.


Why Join Us

  • Opportunity to play a pivotal role in driving quality for a leading performance monitoring company with a focus on storage and networking monitoring.
  • Collaborative and innovative work environment with a global team.
  • Competitive salary and benefits package.
  • Professional growth and development opportunities.
  • Exposure to cutting-edge technologies and enterprise-level challenges.


If you are a passionate QA Engineer with a strong background in automation, testing, and debugging in AWS, Cloud, and Kubernetes environments, and if you are eager to be the voice of quality in a rapidly growing company, we invite you to apply and help us raise the bar on product excellence.

Alliance Recruitment Agency
Posted by Raveena Korani
Pune
2 - 5 yrs
₹1L - ₹3L / yr
Direct sales
B2B Marketing
Communication Skills

·       Conduct telecalling and cold calling to potential clients (target: 50 calls per day).

·       Generate new business leads, especially MNCs or corporates planning expansion, relocation, or renovation of their office spaces.

·       Maintain and update lead databases and contact lists in Excel or CRM tools.

·       Conduct online research to identify new companies, decision-makers, and opportunities in target markets.

·       Coordinate with the design and project teams to share qualified leads for proposal preparation.

·       Record and update all client communications, follow-ups, and meeting schedules systematically.

·       Prepare and submit regular reports on lead generation activities and progress.

·       Support marketing and business outreach campaigns.

NeoGenCode Technologies Pvt Ltd
Posted by Shivank Bhardwaj
Pune
6 - 8 yrs
₹12L - ₹22L / yr
NodeJS (Node.js)
React.js
Javascript
Go Programming (Golang)
Elixir


Job Description – Full Stack Developer (React + Node.js)

Experience: 5–8 Years

Location: Pune

Work Mode: WFO

Employment Type: Full-time


About the Role

We are looking for an experienced Full Stack Developer with strong hands-on expertise in React and Node.js to join our engineering team. The ideal candidate should have solid experience building scalable applications, working with production systems, and collaborating in high-performance tech environments.


Key Responsibilities

  • Design, develop, and maintain scalable full-stack applications using React and Node.js.
  • Collaborate with cross-functional teams to define, design, and deliver new features.
  • Write clean, maintainable, and efficient code following OOP/FP and SOLID principles.
  • Work with relational databases such as PostgreSQL or MySQL.
  • Deploy and manage applications in cloud environments (preferably GCP or AWS).
  • Optimize application performance, troubleshoot issues, and ensure high availability in production systems.
  • Utilize containerization tools like Docker for efficient development and deployment workflows.
  • Integrate third-party services and APIs, including AI APIs and tools.
  • Contribute to improving development processes, documentation, and best practices.


Required Skills

  • Strong experience with React.js (frontend).
  • Solid hands-on experience with Node.js (backend).
  • Good understanding of relational databases: PostgreSQL / MySQL.
  • Experience working in production environments and debugging live systems.
  • Strong understanding of OOP or Functional Programming, and clean coding standards.
  • Knowledge of Docker or other containerization tools.
  • Experience with cloud platforms (GCP or AWS).
  • Excellent written and verbal communication skills.


Good to Have

  • Experience with Golang or Elixir.
  • Familiarity with Kubernetes, RabbitMQ, Redis, etc.
  • Contributions to open-source projects.
  • Previous experience working with AI APIs or machine learning tools.


Banking Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore), Mangalore, Pune, Mumbai
3 - 5 yrs
₹8L - ₹11L / yr
Data Analytics
SQL
Relational Database (RDBMS)
Java
Python

Required Skills: Strong SQL Expertise, Data Reporting & Analytics, Database Development, Stakeholder & Client Communication, Independent Problem-Solving & Automation Skills

 

Review Criteria

· Must have strong SQL skills (queries, optimization, procedures, triggers)

· Must have advanced Excel skills

· Should have 3+ years of relevant experience

· Should have reporting and dashboard creation experience

· Should have database development and maintenance experience

· Must have strong communication skills for client interactions

· Should have the ability to work independently

· Willingness to work from client locations

 

Description

Who is an ideal fit for us?

We seek professionals who are analytical, demonstrate self-motivation, exhibit a proactive mindset, and possess a strong sense of responsibility and ownership in their work.

 

What will you get to work on?

As a member of the Implementation & Analytics team, you will:

● Design, develop, and optimize complex SQL queries to extract, transform, and analyze data

● Create advanced reports and dashboards using SQL, stored procedures, and other reporting tools

● Develop and maintain database structures, stored procedures, functions, and triggers

● Optimize database performance by tuning SQL queries and indexing to handle large datasets efficiently

● Collaborate with business stakeholders and analysts to understand analytics requirements

● Automate data extraction, transformation, and reporting processes to improve efficiency


What do we expect from you?

For the SQL/Oracle Developer role, we are seeking candidates with the following skills and Expertise:

● Proficiency in SQL (Window functions, stored procedures) and MS Excel (advanced Excel skills)

● More than 3 years of relevant experience

● Java / Python experience is a plus but not mandatory

● Strong communication skills to interact with customers to understand their requirements

● Capable of working independently with minimal guidance, showcasing self-reliance and initiative

● Previous experience in automation projects is preferred

● Work From Office: Bangalore/Navi Mumbai/Pune/Client locations
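As a small illustration of the SQL window-function proficiency this role calls for, here is a minimal, hypothetical sketch (the table and data are invented for the example) using Python's bundled sqlite3, which supports window functions:

```python
import sqlite3

# In-memory database with a hypothetical sales table (illustrative data only).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, month TEXT, amount INTEGER);
INSERT INTO sales VALUES
  ('West', '2024-01', 100), ('West', '2024-02', 150),
  ('East', '2024-01', 200), ('East', '2024-02', 120);
""")

# Running total per region -- a typical window-function reporting query.
rows = conn.execute("""
SELECT region, month, amount,
       SUM(amount) OVER (PARTITION BY region ORDER BY month) AS running_total
FROM sales
ORDER BY region, month
""").fetchall()

for row in rows:
    print(row)
```

The `PARTITION BY` clause restarts the running total for each region, the kind of query that typically backs the dashboards and reports this role builds.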

 

Virtana
Posted by Krutika Devadiga
Pune
4 - 10 yrs
Best in industry
Java
Kubernetes
Go Programming (Golang)
Python
Apache Kafka

Senior Software Engineer 

Challenge convention and work on cutting-edge technology that is transforming the way our customers manage their physical, virtual and cloud computing environments. Virtual Instruments seeks highly talented people to join our growing team, where your contributions will impact the development and delivery of our product roadmap. Our award-winning Virtana Platform provides the only real-time, system-wide, enterprise-scale solution for providing visibility into performance, health and utilization metrics, translating into improved performance and availability while lowering the total cost of the infrastructure supporting mission-critical applications.

We are seeking an individual with expert knowledge of Systems Management and/or Systems Monitoring Software, Observability platforms, and/or Performance Management Software and Solutions, with insight into integrated infrastructure platforms like Cisco UCS, infrastructure providers like Nutanix, VMware, EMC & NetApp, and public cloud platforms like Google Cloud and AWS, to expand the depth and breadth of Virtana Products.


Work Location: Pune/ Chennai


Job Type: Hybrid

 

Role Responsibilities: 

  • The engineer will be primarily responsible for architecture, design and development of software solutions for the Virtana Platform 
  • Partner and work closely with cross functional teams and with other engineers and product managers to architect, design and implement new features and solutions for the Virtana Platform. 
  • Communicate effectively across the departments and R&D organization having differing levels of technical knowledge.  
  • Work closely with UX Design, Quality Assurance, DevOps and Documentation teams. Assist with functional and system test design and deployment automation 
  • Provide customers with complex and end-to-end application support, problem diagnosis and problem resolution 
  • Learn new technologies quickly and leverage 3rd party libraries and tools as necessary to expedite delivery 

 

Required Qualifications:    

  • Minimum of 7 years of progressive experience with back-end development in a client-server application development environment focused on Systems Management, Systems Monitoring, and Performance Management Software.
  • Deep experience in public cloud environments (Google Cloud and/or AWS) using Kubernetes and other distributed managed services such as Kafka.
  • Experience with CI/CD and cloud-based software development and delivery.
  • Deep experience with integrated infrastructure platforms and experience working with one or more data collection technologies such as SNMP, REST, OTEL, WMI, or WBEM.
  • Minimum of 6 years of development experience with one or more high-level languages such as Go, Python, or Java; deep experience with one of these languages is required.
  • Bachelor’s or Master’s degree in Computer Science, Computer Engineering, or equivalent.
  • Highly effective verbal and written communication skills, with the ability to lead and participate in multiple projects.
  • Well versed in identifying opportunities and risks in a fast-paced environment, with the ability to adjust to changing business priorities.
  • Must be results-focused and team-oriented, with a strong work ethic.

 

Desired Qualifications: 

  • Prior experience with other virtualization platforms like OpenShift is a plus 
  • Prior experience as a contributor to engineering and integration efforts with strong attention to detail and exposure to Open-Source software is a plus 
  • Demonstrated ability as a lead engineer who can architect, design and code with strong communication and teaming skills 
  • Deep development experience with the development of Systems, Network and performance Management Software and/or Solutions is a plus 

  

About Virtana:  Virtana delivers the industry’s broadest and deepest Observability Platform, allowing organizations to monitor infrastructure, de-risk cloud migrations, and reduce cloud costs by 25% or more.

  

Over 200 Global 2000 enterprise customers, such as AstraZeneca, Dell, Salesforce, Geico, Costco, Nasdaq, and Boeing, have valued Virtana’s software solutions for over a decade. 

  

Our modular platform for hybrid IT digital operations includes Infrastructure Performance Monitoring and Management (IPM), Artificial Intelligence for IT Operations (AIOps), Cloud Cost Management (FinOps), and Workload Placement Readiness Solutions. Virtana is simplifying the complexity of hybrid IT environments with a single cloud-agnostic platform across all the categories listed above. The $30B IT Operations Management (ITOM) Software market is ripe for disruption, and Virtana is uniquely positioned for success.

ImmersiveDataAI
Posted by Ishan Agrawal
Pune
0 - 1 yrs
₹10000 - ₹20000 / mo
Python
SQL
Large Language Models (LLM) tuning
Data engineering

Entry Level | On-Site | Pune

Internship Opportunity: Data + AI Intern

Location: Pune, India (In-office)

Duration: 2 Months

Start Date: Between 11th July 2025 and 15th August 2025

Work Days: Monday to Friday

Stipend: As per company policy

About ImmersiveData.AI

Smarter Data. Smarter Decisions. Smarter Enterprises.™

At ImmersiveData.AI, we don’t just transform data—we challenge and redefine business models. By leveraging cutting-edge AI, intelligent automation, and modern data platforms, we empower enterprises to unlock new value and drive strategic transformation.

About the Internship

As a Data + AI Intern, you will gain hands-on experience at the intersection of data engineering and AI. You’ll be part of a collaborative team working on real-world data challenges using modern tools like Snowflake, DBT, Airflow, and LLM frameworks. This internship is a launchpad for students looking to enter the rapidly evolving field of Data & AI.

Key Responsibilities

  • Assist in designing, building, and optimizing data pipelines and ETL workflows
  • Work with structured and unstructured datasets across various sources
  • Contribute to AI-driven automation and analytics use cases
  • Support backend integration of large language models (LLMs)
  • Collaborate in building data platforms using tools like Snowflake, DBT, and Airflow

Required Skills

  • Proficiency in Python
  • Strong understanding of SQL and relational databases
  • Basic knowledge of Data Engineering and Data Analysis concepts
  • Familiarity with cloud data platforms or willingness to learn (e.g., Snowflake)

Preferred Learning Certifications (Optional but Recommended)

  • Python Programming
  • SQL & MySQL/PostgreSQL
  • Statistical Modeling
  • Tableau / Power BI
  • Voice App Development (Bonus)

Who Can Apply

Only candidates who:

  • Are available full-time (in-office, Pune)
  • Can start between 11th July and 15th August 2025
  • Are available for a minimum of 2 months
  • Have relevant skills and interest in data and AI

Perks

  • Internship Certificate
  • Letter of Recommendation
  • Work with cutting-edge tools and technologies
  • Informal dress code
  • Exposure to real industry use cases and mentorship


Pratiti Technologies
Agency job
via AccioJob by AccioJobHiring Board
Pune
0 - 1 yrs
₹5L - ₹5L / yr
C#

AccioJob is conducting a Walk-In Hiring Drive with Pratiti Technologies for the position of C# Developer.


To apply, register and select your slot here: https://go.acciojob.com/bV6nke


Required Skills: C#, .NET, OOPs


Eligibility:

  • Degree: BTech./BE
  • Branch: All
  • Graduation Year: 2025, 2026


Work Details:

  • Work Location: Pune (Onsite)
  • CTC: 5 LPA


Evaluation Process:

Round 1: Offline Assessment at AccioJob Pune Centre


Further Rounds (for shortlisted candidates only):

Profile Evaluation, Coding Assignment, Telephonic Screening, Technical Interview 1, Technical Interview 2, HR Discussion


Important Note: Bring your laptop & earphones for the test.


Register here: https://go.acciojob.com/bV6nke


FAST SLOT BOOKING

[ DOWNLOAD ACCIOJOB APP ]

https://go.acciojob.com/qWPaH9

Inteliment Technologies
Posted by Ariba Khan
Pune
4 - 7 yrs
Up to ₹20L / yr (varies)
SQL
Data modeling
Data Vault
ERwin
Star schema

About the company:

Inteliment is a niche business analytics company with an almost two-decade proven track record of partnering with hundreds of Fortune 500 global companies. Inteliment operates its ISO-certified development centre in Pune, India, and has business operations in multiple countries through subsidiaries in Singapore and Europe, with its headquarters in India.


About the Role:

We are seeking an experienced Technical Data Professional with hands-on expertise in designing and implementing dimensional data models using Erwin or any dimensional modeling tool, and in building SQL-based solutions adhering to Data Vault 2.0 and Information Mart standards. The ideal candidate will have strong data analysis capabilities, exceptional SQL skills, and a deep understanding of data relationships, metrics, and the granularity of data structures.


Qualifications:

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • Certifications in a related field will be an added advantage.


Key Competencies:

1. Technical Expertise:

  • Proficiency in Erwin for data modeling.
  • Advanced SQL skills with experience in writing and optimizing performance-driven queries.
  • Hands-on experience with Data Vault 2.0 and Information Mart standards is highly preferred.
  • Solid understanding of Star Schema, Facts & Dimensions, and BUS architecture.

2. Analytical Skills:

  • Strong data analysis skills to evaluate data relationships, metrics, and granularities.
  • Capability to troubleshoot and resolve complex data modeling and performance issues.

3. Soft Skills:

  • Strong problem-solving and decision-making skills.
  • Excellent communication and stakeholder management abilities.
  • Proactive and detail-oriented with a focus on delivering high-quality results.


Key Responsibilities:

1. Dimensional Data Modeling:

  • Design and develop dimensional data models using Erwin with a focus on Star Schema and BUS architecture (Fact and Dimension tables).
  • Ensure models align with business requirements and provide scalability, performance, and maintainability.

2. SQL Development:

  • Implement data models in SQL using best practices for view creation, ensuring high performance.
  • Write, optimize, and refactor complex SQL queries for efficiency and performance in large-scale databases.
  • Develop solutions adhering to Information Mart and Data Vault 2.0 standards (dimensional models built from Raw Data Vault tables: Hubs, Links, Satellites, Effectivity Satellites, Bridge, and PIT tables).
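As an illustration of the Data Vault-to-Information Mart pattern named above (all table and column names here are hypothetical, not from the posting): a dimension view can be derived from a Hub and its Satellite by picking the latest Satellite row per key. A minimal sketch using Python's sqlite3:

```python
import sqlite3

# Minimal, illustrative Raw Data Vault structures (names are invented):
# a Hub holds business keys; a Satellite holds descriptive attributes over time.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE hub_customer (customer_hk TEXT PRIMARY KEY, customer_id TEXT);
CREATE TABLE sat_customer (customer_hk TEXT, load_date TEXT, name TEXT, city TEXT);

INSERT INTO hub_customer VALUES ('hk1', 'C001'), ('hk2', 'C002');
INSERT INTO sat_customer VALUES
  ('hk1', '2024-01-01', 'Asha', 'Pune'),
  ('hk1', '2024-06-01', 'Asha', 'Mumbai'),  -- newer satellite row wins
  ('hk2', '2024-01-01', 'Ravi', 'Nashik');

-- Information Mart-style dimension view: current Satellite row per Hub key.
CREATE VIEW dim_customer AS
SELECT h.customer_id, s.name, s.city
FROM hub_customer h
JOIN sat_customer s ON s.customer_hk = h.customer_hk
WHERE s.load_date = (
  SELECT MAX(s2.load_date) FROM sat_customer s2
  WHERE s2.customer_hk = s.customer_hk
);
""")

rows = conn.execute("SELECT * FROM dim_customer ORDER BY customer_id").fetchall()
print(rows)
```

The view presents only the most recent Satellite row for each business key, which is the usual way a dimension is materialized on top of a Raw Vault without altering the historized tables underneath.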

3. Data Analysis & Relationship Metrics:

  • Perform in-depth data analysis to identify patterns, relationships, and metrics at different levels of granularity.
  • Ensure data integrity and quality by validating data models against business expectations.

4. Performance Optimization:

  • Conduct performance tuning of existing data structures, queries, and ETL processes.
  • Provide guidance on database indexing, partitioning, and query optimization techniques.

5. Collaboration:

  • Work closely with business stakeholders, data engineers, and analysts to understand and translate business needs into effective data solutions.
  • Support cross-functional teams to ensure seamless integration and delivery of data solutions.
Pratiti Technologies
Agency job
via AccioJob by AccioJobHiring Board
Pune
0 - 1 yrs
₹5L - ₹5L / yr
Java
DSA

AccioJob is conducting a Walk-In Hiring Drive with Pratiti Technologies for the position of Java Backend Developer.


To apply, register and select your slot here: https://go.acciojob.com/8szZFc


Required Skills: Java, Intermediate DSA


Eligibility:

  • Degree: BTech./BE
  • Branch: All
  • Graduation Year: 2025, 2026


Work Details:

  • Work Location: Pune (Onsite)
  • CTC: 5 LPA


Evaluation Process:

Round 1: Offline Assessment at AccioJob Pune Centre


Further Rounds (for shortlisted candidates only):

Profile Evaluation, Coding Assignment, Telephonic Screening, Technical Interview 1, Technical Interview 2, HR Discussion


Important Note: Bring your laptop & earphones for the test.


Register here: https://go.acciojob.com/8szZFc


FAST SLOT BOOKING

[ DOWNLOAD ACCIOJOB APP ]

https://go.acciojob.com/yqkPXV

Inteliment Technologies
Posted by Ariba Khan
Pune
3 - 5 yrs
Up to ₹16L / yr (varies)
SQL
Python
ETL
Amazon Web Services (AWS)
Azure

About the company:

Inteliment is a niche business analytics company with an almost two-decade proven track record of partnering with hundreds of Fortune 500 global companies. Inteliment operates its ISO-certified development centre in Pune, India, and has business operations in multiple countries through subsidiaries in Singapore and Europe, with its headquarters in India.


About the Role:

As a Data Engineer, you will contribute to cutting-edge global projects and innovative product initiatives, delivering impactful solutions for our Fortune clients. In this role, you will take ownership of the entire data pipeline and infrastructure development lifecycle—from ideation and design to implementation and ongoing optimization. Your efforts will ensure the delivery of high-performance, scalable, and reliable data solutions. Join us to become a driving force in shaping the future of data infrastructure and innovation, paving the way for transformative advancements in the data ecosystem.


Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
  • Certifications in a related field will be an added advantage.


Key Competencies:

  • Must have experience with SQL, Python and Hadoop
  • Good to have experience with Cloud Computing Platforms (AWS, Azure, GCP, etc.), DevOps Practices, Agile Development Methodologies
  • ETL or other similar technologies will be an advantage.
  • Core Skills: Proficiency in SQL, Python, or Scala for data processing and manipulation
  • Data Platforms: Experience with cloud platforms such as AWS, Azure, or Google Cloud.
  • Tools: Familiarity with tools like Apache Spark, Kafka, and modern data warehouses (e.g., Snowflake, BigQuery, Redshift).
  • Soft Skills: Strong problem-solving abilities, collaboration, and communication skills to work effectively with technical and non-technical teams.
  • Additional: Knowledge of SAP would be an advantage 


Key Responsibilities:

  • Data Pipeline Development: Build, maintain, and optimize ETL/ELT pipelines for seamless data flow.
  • Data Integration: Consolidate data from various sources into unified systems.
  • Database Management: Design and optimize scalable data storage solutions.
  • Data Quality Assurance: Ensure data accuracy, consistency, and completeness.
  • Collaboration: Work with analysts, scientists, and stakeholders to meet data needs.
  • Performance Optimization: Enhance pipeline efficiency and database performance.
  • Data Security: Implement and maintain robust data security and governance policies
  • Innovation: Adopt new tools and design scalable solutions for future growth.
  • Monitoring: Continuously monitor and maintain data systems for reliability.
  • Data Engineers ensure reliable, high-quality data infrastructure for analytics and decision-making.
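The pipeline responsibilities above follow a common extract-transform-load shape. A minimal, illustrative sketch (the source data, column names, and target table are invented; SQLite stands in for a real warehouse):

```python
import csv
import io
import sqlite3

# Extract: read rows from a CSV source (a StringIO stands in for a real file).
raw_csv = io.StringIO("order_id,amount\n1,100\n2,250\n3,75\n")
rows = list(csv.DictReader(raw_csv))

# Transform: cast types and derive a high-value flag.
transformed = [
    (int(r["order_id"]), int(r["amount"]), int(r["amount"]) > 100)
    for r in rows
]

# Load: insert into a target table in the warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount INTEGER, high_value BOOLEAN)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", transformed)

# A simple data-quality check on the loaded data: row count and total amount.
totals = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
print(totals)
```

Real pipelines add scheduling, incremental loads, and validation on top of this skeleton, but the extract/transform/load separation stays the same.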
Atos Group

Agency job
via TrueTech Solutions by Pavan Kalyan
Chennai, Bengaluru (Bangalore), Pune, Mumbai
5 - 15 yrs
₹10L - ₹20L / yr
UFT
Automated testing
Automation
Perfecto
Test Automation (QA)

Primary Skills:

  • UFT and Perfecto
  • Automation Testing
  • Test Script Development
  • Test Management Tools (e.g., ALM/Quality Center)

 

Responsibilities

  • Design, develop, and execute automated test scripts using UFT and Perfecto tools.
  • Collaborate with development and QA teams to identify test requirements and create test plans.
  • Perform functional, regression, and performance testing to ensure software quality.
  • Analyze test results, identify defects, and work with developers to resolve issues.
  • Maintain and update existing automated test scripts and frameworks.
  • Document and report testing activities, results, and software quality metrics.
  • Ensure compliance with industry standards and best practices in testing.

Inteliment Technologies
Posted by Ariba Khan
Pune
3 - 5 yrs
Up to ₹12L / yr (varies)
PowerBI
SQL
DAX
Power Query

About the company:

Inteliment is a niche business analytics company with an almost two-decade proven track record of partnering with hundreds of Fortune 500 global companies. Inteliment operates its ISO-certified development centre in Pune, India, and has business operations in multiple countries through subsidiaries in Singapore and Europe, with its headquarters in India.


About the role:

As a Power BI Developer, you will work closely with business analysts, data engineers, and key stakeholders to transform complex datasets into actionable insights. Your expertise will be pivotal in designing and delivering visually engaging reports, dashboards, and data-driven stories that empower informed decision-making across the organization. By translating raw data into meaningful visuals, you will play a critical role in driving strategic initiatives and fostering a culture of data-driven excellence.


Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
  • Certifications in a related field will be an added advantage.


Key Competencies:

  • Technical Skills: Proficiency in Power BI, DAX, Power Query, SQL, and data visualization best practices.
  • Additional Tools: Familiarity with Azure Data Factory, Power Automate, and other components of the Power Platform is advantageous.
  • Soft Skills: Strong analytical thinking, problem-solving, and communication skills for interacting with technical and non-technical audiences.
  • Additional skills: Domain understanding is a plus


Key Responsibilities:

1. Data Integration & Modelling

  • Extract, transform, and load (ETL) data from various sources (SQL, Excel, APIs, etc.).
  • Design and develop efficient data models to support reporting needs.
  • Ensure data integrity and optimize performance through best practices.

2. Report Development

  • Understand business requirements and build reports that provide analytical insights.
  • Build visually appealing, interactive dashboards and reports in Power BI.
  • Implement DAX (Data Analysis Expressions) for complex calculations and measures.
  • Design user-friendly layouts that align with stakeholder requirements.

3. Collaboration

  • Work with stakeholders to gather business requirements and translate them into technical solutions.
  • Collaborate with data engineers and analysts to ensure cohesive reporting strategies.
  • Provide support and training for end-users to maximize adoption and usage of Power BI solutions.

4. Performance Optimization

  • Optimize dashboards and reports for better speed and responsiveness.
  • Monitor and improve data refresh processes for real-time reporting.

5. Governance and Security

  • Implement row-level security (RLS) and adhere to organizational data governance policies.
  • Manage Power BI workspaces and permissions.

6. Continuous Improvement

  • Stay updated with Power BI features and industry trends.
  • Proactively recommend enhancements to existing solutions.
Read more
Pratiti Technologies
Agency job
via AccioJob by AccioJobHiring Board
Pune
0 - 1 yrs
₹5L - ₹5L / yr
C++

AccioJob is conducting a Walk-In Hiring Drive with Pratiti Technologies for the position of C++ Developer.


To apply, register and select your slot here: https://go.acciojob.com/PfCEBz


Required Skills: C++


Eligibility:

  • Degree: BTech./BE
  • Branch: All
  • Graduation Year: 2025, 2026


Work Details:

  • Work Location: Pune (Onsite)
  • CTC: 5 LPA


Evaluation Process:

Round 1: Offline Assessment at AccioJob Pune Centre


Further Rounds (for shortlisted candidates only):

Profile Evaluation, Coding Assignment, Telephonic Screening, Technical Interview 1, Technical Interview 2, HR Discussion


Important Note: Bring your laptop & earphones for the test.


Register here: https://go.acciojob.com/PfCEBz


FAST SLOT BOOKING

[ DOWNLOAD ACCIOJOB APP ]

https://go.acciojob.com/78Xvcm

Read more
Bits In Glass

at Bits In Glass

3 candid answers
Nikita Sinha
Posted by Nikita Sinha
Pune, Hyderabad, Mohali
8 - 11 yrs
Upto ₹40L / yr (Varies)
Data modeling
Google Cloud Platform (GCP)

Responsibilities

  • Act as a liaison between business and technical teams to bridge gaps and support successful project delivery.
  • Maintain high-quality metadata and data artifacts that are accurate, complete, consistent, unambiguous, reliable, accessible, traceable, and valid.
  • Create and deliver high-quality data models while adhering to defined data governance practices and standards.
  • Translate high-level functional or business data requirements into technical solutions, including database design and data mapping.
  • Participate in requirement-gathering activities, elicitation, gap analysis, data analysis, effort estimation, and review processes.

Qualifications

  • 8–12 years of strong data analysis and/or data modeling experience.
  • Strong individual contributor with solid understanding of SDLC and Agile methodologies.
  • Comprehensive expertise in conceptual, logical, and physical data modeling.

Skills

  • Strong financial domain knowledge and data analysis capabilities.
  • Excellent communication and stakeholder management skills.
  • Ability to work effectively in a fast-paced and continuously evolving environment.
  • Problem-solving mindset with a solution-oriented approach.
  • Team player with a self-starter attitude and strong sense of ownership.
  • Proficiency in SQL, MS Office tools, GCP BigQuery, Erwin, and Visual Paradigm (preferred).
Read more
Talent Pro
Mayank choudhary
Posted by Mayank choudhary
Pune
2 - 15 yrs
₹4L - ₹15L / yr
MS-Excel
Dialer system

Strong dialer manager profile.

Mandatory (Experience 1): Must have 2+ years of core dialer operations experience with major dialer systems — Genesys, Avaya, Aspect, Ameyo.

Mandatory (Experience 2): Must have hands-on experience in outbound campaign creation, list uploads, pacing changes, and real-time monitoring.

Mandatory (Technical Experience): Must understand routing/pacing optimization, retry logic, call attempts, ANI strategy, contact strategy, compliance norms, etc.

Read more
Onepoint IT Consulting Pvt Ltd
Maithili Shetty
Posted by Maithili Shetty
Pune
0 - 1 yrs
₹4L - ₹4L / yr
Figma
Graphic Designing
Adobe XD

Onepoint does more than solve complex digital challenges. It is a place for you to explore your potential, elevate yourself, and transform your career.

As a UI/UX designer, you would be creating designs for websites, apps, presentations, and marketing materials to help our teams communicate effectively and deliver engaging experiences. You are passionate about user-centred design, visual storytelling, and exploring how AI tools can enhance creativity and efficiency. You have an eye for detail, and you don’t assume anything. Working together with skilled technical and marketing experts and collaborating closely with other designers, you will help create cohesive, world-class experiences that strengthen our brand and support business growth.


Please see the job description for full details of the position:

https://www.onepointltd.com/wp-content/uploads/2025/10/PUNE_JD2025-10_IN.001-UI-UX-Designer.pdf


Read more
Pune
3 - 10 yrs
₹3L - ₹8L / yr
webflow
Search Engine Optimization (SEO)
Social Media Marketing (SMM)
Brand Management
User Interface (UI) Design
+2 more

Amplifai is a consultancy and investment company focused on helping organizations succeed with Growth, AI and automation. We work with startups, scale-ups, and established firms to translate complex business challenges into practical AI solutions. Alongside our advisory work, we build and launch our own ventures—such as Corevina, designed for private equity and investment companies. 

We are now looking for a creative and hands-on Marketing Executive with 3–5 years of experience to join our growing team. This role will be central in shaping how Amplifai and our ventures are presented to the world. You’ll own our web and social media presence and drive a modern, professional digital identity.

 

What you’ll do 

  • Build and manage our websites (ideally in Webflow), ensuring they are visually strong, user-friendly, and up to date.
  • Optimize our digital platforms for performance, accessibility, SEO, and LLM discoverability.
  • Create and acquire content and graphics for our digital presence.
  • Run social media marketing, predominantly on LinkedIn.
  • Collaborate with colleagues across advisory, venture, and commercial teams to deliver assets that support sales, marketing, and product launches.

 

What we’re looking for 

  • Experience building and maintaining websites, preferably in Webflow.
  • Experience with social media and email campaigns. 
  • Proficiency with tools such as Canva, or similar. 
  • Good understanding of UX/UI and responsive design. 
  • Experience with SEO and web analytics. 
  • Strong command of English (work language). 

 

Nice to have 

  • Skills in video editing or motion graphics. 
  • Background in B2B marketing or tech-driven companies. 

 

Who you’ll work with 

You’ll join a senior, entrepreneurial team with decades of experience in AI, automation, and business development. We’re commercially driven yet deeply technical, operating at the intersection of strategy and innovation.

Our culture is collaborative, fast-moving, and pragmatic: we move quickly from ideas to execution, and everyone has a direct impact on outcomes. 

The role reports to our India team and works closely with our Norwegian leadership.

 

Why join us 

  • Shape and own the visual identity of a growing AI consultancy and its ventures. 
  • Enjoy creative freedom with real responsibility. 
  • Work in a dynamic, international environment where technology and design meet business impact. 
Read more
Velnir

at Velnir

1 candid answer
Swagatika swain
Posted by Swagatika swain
Remote, Pune
2 - 3 yrs
₹3L - ₹6L / yr
React.js
NodeJS (Node.js)
Javascript
TypeScript
Express
+11 more

About the Role

We’re looking for a skilled Full Stack Developer with around 2 years of hands-on experience in building and deploying modern web applications. You’ll work closely with cross-functional teams to develop scalable solutions using the latest technologies.


Key Responsibilities

  • Develop, test, and deploy responsive web applications using modern frameworks.
  • Collaborate with designers to turn UI/UX concepts into functional products.
  • Build and integrate RESTful APIs and backend services.
  • Write clean, maintainable, and efficient code.
  • Debug, optimize, and improve application performance.
  • Manage containerized deployments and CI/CD workflows.



Required Skills

  • Must be a B.Tech graduate (Computer Science or IT).
  • Pune-based candidates preferred.
  • Strong proficiency in JavaScript, TypeScript, React.js, and Node.js.
  • Experience with Express.js, MongoDB/MySQL, and REST APIs.
  • Hands-on experience with Docker and cloud platforms (AWS, Azure, or GCP).
  • Familiarity with Git, CI/CD pipelines, and basic DevOps practices.
  • Working knowledge of Next.js is a plus.
  • Excellent problem-solving and communication skills.
  • Immediate joiners preferred.


Nice to Have

  • Exposure to React Native or mobile app development.
  • Exposure to GraphQL
  • Experience with cloud infrastructure automation or serverless deployments.
  • Prior experience working in IT services or client delivery projects.


Perks

  • Hybrid work model (2–3 days in office, Pune)
  • Exposure to global client projects
  • Collaborative, fast-paced learning environment




Note: We use an ATS, so please make your CV clear and well-structured, and include relevant skills to improve your chances of being shortlisted.



Read more
Global Digital Transformation Solutions Provider

Global Digital Transformation Solutions Provider

Agency job
via Peak Hire Solutions by Dhara Thakkar
Pune
6 - 12 yrs
₹15L - ₹30L / yr
Machine Learning (ML)
Amazon Web Services (AWS)
Kubernetes
ECS
Amazon Redshift
+14 more

Core Responsibilities:

  • The MLE will design, build, test, and deploy scalable machine learning systems, optimizing model accuracy and efficiency.
  • Model Development: Build models spanning traditional statistical methods to deep learning, along with employing LLMs in modern frameworks.
  • Data Preparation: Prepare, cleanse, and transform data for model training and evaluation.
  • Algorithm Implementation: Implement and optimize machine learning algorithms and statistical models.
  • System Integration: Integrate models into existing systems and workflows.
  • Model Deployment: Deploy models to production environments and monitor performance.
  • Collaboration: Work closely with data scientists, software engineers, and other stakeholders.
  • Continuous Improvement: Identify areas for improvement in model performance and systems.

 

Skills:

  • Programming and Software Engineering: Knowledge of software engineering best practices (version control, testing, CI/CD).
  • Data Engineering: Ability to handle data pipelines, data cleaning, and feature engineering. Proficiency in SQL for data manipulation, plus Kafka, ChaosSearch logs, etc. for troubleshooting; other tech touch points are ScyllaDB (similar to Bigtable), OpenSearch, and Neo4j graph database.
  • Model Deployment and Monitoring: MLOps Experience in deploying ML models to production environments.
  • Knowledge of model monitoring and performance evaluation.

 

Required experience:

  • Amazon SageMaker: Deep understanding of SageMaker's capabilities for building, training, and deploying ML models; understanding of the SageMaker pipeline with the ability to analyze gaps and recommend/implement improvements.
  • AWS Cloud Infrastructure: Familiarity with S3, EC2, and Lambda, and using these services in ML workflows.
  • AWS Data: Redshift, Glue.
  • Containerization and Orchestration: Understanding of Docker and Kubernetes, and their implementation within AWS (EKS, ECS)

 

Skills: AWS, AWS Cloud, Amazon Redshift, EKS

 

Must-Haves

Machine Learning + AWS + (EKS OR ECS OR Kubernetes) + (Redshift AND Glue) + SageMaker

Notice period: 0 to 15 days only

Hybrid work mode: 3 days in office, 2 days at home

Read more
Virtana

at Virtana

2 candid answers
Krutika Devadiga
Posted by Krutika Devadiga
Pune
8 - 13 yrs
Best in industry
Java
Kubernetes
Amazon Web Services (AWS)
Spring Boot
Go Programming (Golang)
+13 more

Company Overview:

Virtana delivers the industry’s only unified platform for Hybrid Cloud Performance, Capacity and Cost Management. Our platform provides unparalleled, real-time visibility into the performance, utilization, and cost of infrastructure across the hybrid cloud – empowering customers to manage their mission critical applications across physical, virtual, and cloud computing environments. Our SaaS platform allows organizations to easily manage and optimize their spend in the public cloud, assure resources are performing properly through real-time monitoring, and provide the unique ability to plan migrations across the hybrid cloud. 

As we continue to expand our portfolio, we are seeking a highly skilled and hands-on Staff Software Engineer in backend technologies to contribute to the futuristic development of our sophisticated monitoring products.

 

Position Overview:

As a Staff Software Engineer specializing in backend technologies for Storage and Network monitoring in an AI-enabled data center as well as the cloud, you will play a critical role in designing, developing, and delivering high-quality features within aggressive timelines. Your expertise in microservices-based streaming architectures and strong hands-on development skills are essential to solve complex problems related to large-scale data processing. Proficiency in backend technologies such as Java and Python is crucial.



Work Location: Pune


Job Type: Hybrid

 

Key Responsibilities:

  • Hands-on Development: Actively participate in the design, development, and delivery of high-quality features, demonstrating strong hands-on expertise in backend technologies like Java, Python, Go or related languages.
  • Microservices and Streaming Architectures: Design and implement microservices-based streaming architectures to efficiently process and analyze large volumes of data, ensuring real-time insights and optimal performance.
  • Agile Development: Collaborate within an agile development environment to deliver features on aggressive schedules, maintaining a high standard of quality in code, design, and architecture.
  • Feature Ownership: Take ownership of features from inception to deployment, ensuring they meet product requirements and align with the overall product vision.
  • Problem Solving and Optimization: Tackle complex technical challenges related to data processing, storage, and real-time monitoring, and optimize backend systems for high throughput and low latency.
  • Code Reviews and Best Practices: Conduct code reviews, provide constructive feedback, and promote best practices to maintain a high-quality and maintainable codebase.
  • Collaboration and Communication: Work closely with cross-functional teams, including UI/UX designers, product managers, and QA engineers, to ensure smooth integration and alignment with product goals.
  • Documentation: Create and maintain technical documentation, including system architecture, design decisions, and API documentation, to facilitate knowledge sharing and onboarding.


Qualifications:

  • Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
  • 8+ years of hands-on experience in backend development, demonstrating expertise in Java, Python or related technologies.
  • Strong domain knowledge in Storage and Networking, with exposure to monitoring technologies and practices.
  • Experience in handling large data lakes with purpose-built data stores (vector databases, NoSQL, graph, time-series).
  • Practical knowledge of OO design patterns and frameworks like Spring and Hibernate.
  • Extensive experience with cloud platforms such as AWS, Azure or GCP and development expertise on Kubernetes, Docker, etc.
  • Solid experience designing and delivering features with high quality on aggressive schedules.
  • Proven experience in microservices-based streaming architectures, particularly in handling large amounts of data for storage and networking monitoring.
  • Familiarity with performance optimization techniques and principles for backend systems.
  • Excellent problem-solving and critical-thinking abilities.
  • Outstanding communication and collaboration skills.


Why Join Us:

  • Opportunity to be a key contributor in the development of a leading performance monitoring company specializing in AI-powered Storage and Network monitoring.
  • Collaborative and innovative work environment.
  • Competitive salary and benefits package.
  • Professional growth and development opportunities.
  • Chance to work on cutting-edge technology and products that make a real impact.


If you are a hands-on technologist with a proven track record of designing and delivering high-quality features on aggressive schedules and possess strong expertise in microservices-based streaming architectures, we invite you to apply and help us redefine the future of performance monitoring.

Read more
Deqode

at Deqode

1 recruiter
Samiksha Agrawal
Posted by Samiksha Agrawal
Mumbai, Pune, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Indore, Bengaluru (Bangalore)
4 - 7 yrs
₹4L - ₹10L / yr
Java
Spring Boot
Microservices
SQL
Hibernate (Java)

Job Description

Role: Java Developer

Location: PAN India

Experience: 4+ Years

Required Skills -

  1. 3+ years Java development experience
  2. Spring Boot framework expertise (MANDATORY)
  3. Microservices architecture design & implementation (MANDATORY)
  4. Hibernate/JPA for database operations (MANDATORY)
  5. RESTful API development (MANDATORY)
  6. Database design and optimization (MANDATORY)
  7. Container technologies (Docker/Kubernetes)
  8. Cloud platforms experience (AWS/Azure)
  9. CI/CD pipeline implementation
  10. Code review and quality assurance
  11. Problem-solving and debugging skills
  12. Agile/Scrum methodology
  13. Version control systems (Git)


Read more
Hashone Careers
Bengaluru (Bangalore), Pune, Hyderabad
5 - 10 yrs
₹12L - ₹25L / yr
DevOps
Python
CI/CD
Kubernetes
Docker
+1 more

Job Description

Experience: 5 - 9 years

Location: Bangalore/Pune/Hyderabad

Work Mode: Hybrid (3 days WFO)


Senior Cloud Infrastructure Engineer for Data Platform 


The ideal candidate will play a critical role in designing, implementing, and maintaining cloud infrastructure and CI/CD pipelines to support scalable, secure, and efficient data and analytics solutions. This role requires a strong understanding of cloud-native technologies, DevOps best practices, and hands-on experience with Azure and Databricks.


Key Responsibilities:


Cloud Infrastructure Design & Management

Architect, deploy, and manage scalable and secure cloud infrastructure on Microsoft Azure.

Implement best practices for Azure Resource Management, including resource groups, virtual networks, and storage accounts.

Optimize cloud costs and ensure high availability and disaster recovery for critical systems.


Databricks Platform Management

Set up, configure, and maintain Databricks workspaces for data engineering, machine learning, and analytics workloads.

Automate cluster management, job scheduling, and monitoring within Databricks.

Collaborate with data teams to optimize Databricks performance and ensure seamless integration with Azure services.


CI/CD Pipeline Development

Design and implement CI/CD pipelines for deploying infrastructure, applications, and data workflows using tools like Azure DevOps, GitHub Actions, or similar.

Automate testing, deployment, and monitoring processes to ensure rapid and reliable delivery of updates.


Monitoring & Incident Management

Implement monitoring and alerting solutions using tools like Dynatrace, Azure Monitor, Log Analytics, and Databricks metrics.

Troubleshoot and resolve infrastructure and application issues, ensuring minimal downtime.


Security & Compliance

Enforce security best practices, including identity and access management (IAM), encryption, and network security.

Ensure compliance with organizational and regulatory standards for data protection and cloud operations.


Collaboration & Documentation

Work closely with cross-functional teams, including data engineers, software developers, and business stakeholders, to align infrastructure with business needs.

Maintain comprehensive documentation for infrastructure, processes, and configurations.


Required Qualifications

Education: Bachelor’s degree in Computer Science, Engineering, or a related field.


Must Have Experience:

6+ years of experience in DevOps or Cloud Engineering roles.

Proven expertise in Microsoft Azure services, including Azure Data Lake, Azure Databricks, Azure Data Factory (ADF), Azure Functions, Azure Kubernetes Service (AKS), and Azure Active Directory.

Hands-on experience with Databricks for data engineering and analytics.


Technical Skills:

Proficiency in Infrastructure as Code (IaC) tools like Terraform, ARM templates, or Bicep.

Strong scripting skills in Python or Bash.

Experience with containerization and orchestration tools like Docker and Kubernetes.

Familiarity with version control systems (e.g., Git) and CI/CD tools (e.g., Azure DevOps, GitHub Actions).


Soft Skills:

Strong problem-solving and analytical skills.

Excellent communication and collaboration abilities.

Read more
Netra Labs
Poonam Topagi
Posted by Poonam Topagi
Pune
2 - 6 yrs
₹7L - ₹15L / yr
FastAPI
Amazon EC2
MongoDB
Redis
Celery
+3 more

Role Description


This is a full-time on-site role for a Python Developer located in Pune. The Python Developer will be responsible for back-end web development, software development, and programming using Python. Day-to-day tasks include developing, testing, and maintaining scalable web applications and server-side logic, as well as optimizing performance and integrating user-facing elements with server-side logic. The role also demands collaboration with cross-functional teams to define, design, and ship new features.


Key Responsibilities

  • Lead the backend development team, ensuring best practices in coding, architecture, and performance optimization.
  • Design, develop, and maintain scalable backend services using Python and FastAPI.
  • Architect and optimize databases, ensuring efficient storage and retrieval of data using MongoDB.
  • Integrate AI models and data science workflows into enterprise applications.
  • Implement and manage AWS cloud services, including Lambda, S3, EC2, and other AWS components.
  • Automate deployment pipelines using Jenkins and CI/CD best practices.
  • Ensure security and reliability, implementing best practices for authentication, authorization, and data privacy.
  • Monitor and troubleshoot system performance, optimizing infrastructure and codebase.
  • Collaborate with data scientists, front-end engineers, and product team to build AI-driven solutions.
  • Stay up to date with the latest technologies in AI, backend development, and cloud computing.


Required Skills & Qualifications

  • 3-4 years of experience in backend development with Python.
  • Strong experience in the FastAPI framework.
  • Proficiency in MongoDB or other NoSQL databases.
  • Hands-on experience with AWS services (Lambda, S3, EC2, etc.).
  • Experience with Jenkins and CI/CD pipelines.
  • Data Science knowledge with experience integrating AI models and data pipelines.
  • Strong understanding of RESTful API design, microservices, and event-driven architecture.
  • Experience in performance tuning, caching, and security best practices.
  • Proficiency in working with Docker and containerized applications.
Read more
Pune
6 - 8 yrs
₹45L - ₹50L / yr
Python
Databricks
Machine Learning (ML)
Artificial Intelligence (AI)
CI/CD

We are looking for a Senior AI / ML Engineer to join our fast-growing team and help build AI-driven data platforms and intelligent solutions. If you are passionate about AI, data engineering, and building real-world GenAI systems, this role is for you!



🔧 Key Responsibilities

• Develop and deploy AI/ML models for real-world applications

• Build scalable pipelines for data processing, training, and evaluation

• Work on LLMs, RAG, embeddings, and agent workflows

• Collaborate with data engineers, product teams, and software developers

• Write clean, efficient Python code and ensure high-quality engineering practices

• Handle model monitoring, performance tuning, and documentation



Required Skills

• 2–5 years of experience in AI/ML engineering

• Strong knowledge of Python, TensorFlow/PyTorch

• Experience with LLMs, GenAI, RAG, or NLP

• Knowledge of Databricks, MLOps or cloud platforms (AWS/Azure/GCP)

• Good understanding of APIs, distributed systems, and data pipelines



🎯 Good to Have

• Experience in healthcare, SaaS, or big data

• Exposure to Databricks Mosaic AI

• Experience building AI agents

Read more
Bengaluru (Bangalore), Pune, Hyderabad, Noida
8 - 12 yrs
₹10L - ₹25L / yr
Java
Spring Boot
MongoDB
Windows Azure
React.js

Job Description:


Senior Full Stack Developer (Java + React)

Experience: 7+ Years

Location: Bangalore, Hyderabad, Pune, Noida

Employment Type: Full-time

Preferred: Ready to join within 15–30 days


🔍 About the Role

We are seeking a highly skilled Senior Full Stack Developer with strong backend expertise in Java and hands-on experience with React on the frontend. The ideal candidate should possess exceptional analytical skills, deep knowledge of software design principles, and the ability to build scalable, high-performance applications.

🧠 Key Responsibilities

  • Design, develop, and maintain scalable backend services using Core Java & Spring frameworks.
  • Build responsive and interactive UI components using ReactJS/Redux.
  • Implement high-quality code using TDD/BDD practices (JUnit, JBehave/Cucumber).
  • Work on RESTful API development, integration, and optimization.
  • Develop and manage efficient database schemas using SQL (DB2) and MongoDB.
  • Collaborate with cross-functional teams (DevOps, QA, Product) to deliver robust solutions.
  • Participate in code reviews, technical discussions, and architectural decisions.
  • Optimize system performance using multithreading, caching, and scalable design patterns.

🛠️ Required Skills

Backend (Strong Expertise Required)

  • 7+ years of experience in Java backend development
  • Deep knowledge of:
  • Core Java (class loading, garbage collection, collections, streams, reflections)
  • OOPs, data structures, algorithms, graph data
  • Design patterns, MVC, multithreading, recursion
  • Spring, JSR-303, Logback, Apache Commons

Database Skills

  • Strong knowledge of Relational Databases & SQL (DB2)
  • Good understanding of NoSQL (MongoDB)

Frontend Skills

  • Solid experience with ReactJS/Redux
  • Strong understanding of REST APIs, JSON, XML, HTTP

DevOps & Tools

  • Strong knowledge of Git, Gradle, Jenkins, CI/CD pipelines
  • Experience with Liquibase for schema management
  • Hands-on with Unix/Linux

✨ Good to Have

  • Experience with Azure, Snowflake, Databricks
  • Knowledge of Camunda 7/8 (BPMN/DMN)
  • Experience with TDD, BDD methodologies
  • Understanding of workflow engines & cloud data stack

🎓 Education

  • Bachelor’s degree in Computer Science, Engineering, or a related field.


Read more
Bengaluru (Bangalore), Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Mumbai, Pune, Hyderabad, Chennai, Nagpur, Coimbatore, Jaipur, Kolkata, Ahmedabad
3.5 - 13 yrs
₹3L - ₹30L / yr
SailPoint
Identity management

We are looking for Developers, Technical Leads, and Architects for the SailPoint IDN & IIQ platforms. As a security professional, you will be responsible for defining requirements, designing and building security components, and testing the SailPoint IDN & IIQ platforms.

Roles & Responsibilities:

1. Configuration, customization, and design of SailPoint. 

2. Experience in virtual appliance (VA) concepts, Identity Profiles, and cloud rules. 

3. Transformation rules, migration, and deployments. 

4. Ability to set up, troubleshoot, and configure SailPoint integrations with different systems. 

5. Able to quickly onboard applications and migrate users. 

6. Ability to build Java BeanShell scripts, workflows, JML, and custom rules. 

7. Liaise with teams on delivery, helping them with technical issues, bug fixes, and enhancements.

Professional & Technical Skills:

1. Should have the ability to understand customer requirements. 

2. Knowledge of integrating various platforms with SailPoint, such as Active Directory, HR apps, SAP systems, Workday, Azure O365, JDBC, and other cloud applications. 

3. Implementation knowledge of access request customization. 

4. Hands-on experience in customization of QuickLinks, User LCM, certifications, custom workflows, forms, rules, SailPoint IIQ API / REST API, etc. 

5. Implementation experience in certifications, custom reports, and auditing. 

6. Strong Java/J2EE development knowledge.

Read more
Ignite Solutions

at Ignite Solutions

6 recruiters
Eman Khan
Posted by Eman Khan
Remote, Pune
5 - 8 yrs
₹10L - ₹16L / yr
Laravel
Vue.js
React.js
PHP
Javascript

We are seeking an experienced Senior Full-Stack Developer with expertise in Laravel and Vue.js/React.js to join our dynamic development team. The ideal candidate will work on complex web application projects, collaborate with customer teams, and drive technical excellence.


Key Responsibilities

  • Design, develop, and maintain scalable web applications using Laravel
  • Build responsive, interactive front-end interfaces using Vue.js
  • Architect database schemas and optimize database performance
  • Write clean, maintainable, and well-documented code
  • Conduct code reviews and ensure adherence to coding standards
  • Troubleshoot and resolve complex technical issues
  • Collaborate with cross-functional teams including designers, product managers, and QA
  • Participate in sprint planning, daily standups, and retrospectives
  • Estimate project timelines and deliverables accurately
  • Ensure application security and performance optimization
  • Mentor junior and mid-level developers


Required Qualifications


Technical Skills

  • 5+ years of experience in web development
  • Expert-level proficiency in:
  • Laravel (8.x/9.x/10.x)
  • Vue.js (Vue 2 & Vue 3)
  • Strong knowledge of PHP 7.4+ and modern PHP practices
  • Proficiency in JavaScript (ES6+), HTML5, and CSS3
  • Experience with RESTful API development and consumption
  • Database expertise (MySQL, PostgreSQL)
  • Version control with Git


Additional Technical Requirements

  • Experience with build tools (Webpack, Vite, NPM/Yarn)
  • Knowledge of CSS preprocessors (Sass, Less) or CSS frameworks
  • Familiarity with testing frameworks (PHPUnit, Jest, Vue Test Utils)
  • Understanding of cloud platforms (AWS, Azure, or GCP)
  • Experience with containerization (Docker) is a plus


Soft Skills

  • Strong problem-solving and analytical thinking abilities
  • Excellent communication and collaboration skills
  • Ability to work independently and manage multiple priorities
  • Attention to detail and commitment to quality


Additional Qualifications

  • Experience with other PHP frameworks (Symfony, CodeIgniter)
  • Knowledge of additional frontend frameworks (React, Angular)
  • Familiarity with DevOps practices and CI/CD pipelines
  • Experience with microservices architecture
  • Previous experience in an Agile/Scrum environment


What We Offer

  • Competitive salary commensurate with experience
  • Flexible work arrangements (remote/hybrid options)
  • Professional development opportunities
  • Modern development tools and equipment
  • Collaborative and innovative work environment
Read more
Hunarstreet technologies pvt ltd


Agency job
Pune
5 - 8 yrs
₹5L - ₹10L / yr
horeca sales
e commerce
quick commerce
modern trade
Sales
+1 more

Qualifications & Requirements

  • Bachelor’s Degree required; MBA / Master’s in Marketing preferred.
  • 5–7 years of FMCG (Food) sales experience — experience in Modern Trade / E-commerce / Quick Commerce / HoReCa is essential.
  • Strong leadership and team-building capabilities.
  • Excellent communication, negotiation, and relationship management skills.
  • Hands-on experience in Amazon / online retail operations preferred.
  • Analytical mindset with sound knowledge of market trends, pricing, and brand visibility strategies.

Read more
Financial Services Industry


Agency job
via Peak Hire Solutions by Dhara Thakkar
Pune
5 - 10 yrs
₹6L - ₹8L / yr
Operations management
Genesys
Avaya
DMS
Internal audit
+3 more

ROLES AND RESPONSIBILITIES:

The Manager/Deputy Unit Manager – Dialer Ops serves as the functional owner of dialer performance across DMS portfolios. The role independently manages end-to-end dialer operations, leads performance optimization, and drives analytical insights to strengthen calling effectiveness. This position acts as a bridge between tactical execution and strategic decision-making, ensuring that dialer efficiency, compliance, and penetration targets are consistently met. The role requires strong domain expertise, analytical rigor, and cross-functional collaboration.


KEY RESPONSIBILITIES:

Dialer Strategy & Execution:

  • Manage end-to-end execution of multiple dialer campaigns across various DMS delinquency buckets (X, 30+, 60+, 90+, etc.).
  • Develop and execute segmentation-based calling strategies as guided by the business and senior leadership.
  • Conduct deep-dive diagnostics on dialer logs, routing, pacing logic, and campaign-level performance.


Performance Optimization & Analytics:

  • Analyze daily/weekly dialer metrics and highlight performance trends, gaps, and improvement opportunities.
  • Implement dialing strategy enhancements—attempt logic, retry cycles, skill routing, and call distribution.


Compliance & Governance:

  • Conduct rigorous audits of calling lists, exclusion logic, DND filters, opt-outs, customer consent, and allowed calling windows.
  • Ensure adherence to regulatory guidelines, internal audit standards, and risk controls.
  • Maintain robust documentation of campaign changes, SOP updates, and version logs.


Cross-Functional Coordination:

  • Collaborate with Operations for workforce alignment, agent readiness, and campaign prioritization.
  • Work with Tech teams to resolve dialer issues, API failures, latency lags, and integration errors.
  • Partner with Quality and Policy teams to align operational execution with business rules.
  • Coordinate with MIS teams for data validation and report automation.


Reporting & Governance:

  • Prepare and publish daily/weekly/monthly MIS reports, dashboards, and analytical presentations.
  • Present dialer performance during weekly business reviews and highlight key insights or areas requiring action.
  • Track KPI dashboards for leadership, including connect %, agent utilization, penetration, drop %, and productivity metrics.
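As an illustration of the connect % and penetration metrics tracked above, a minimal sketch in Python; the record layout, numbers, and the definition of penetration (distinct accounts attempted over accounts loaded) are invented for the example:

```python
# Hypothetical attempt records; field names are illustrative, not from any
# specific dialer's export format.
attempts = [
    {"account": "A1", "connected": True},
    {"account": "A1", "connected": False},
    {"account": "A2", "connected": False},
    {"account": "A3", "connected": True},
]
total_accounts = 5  # accounts loaded into the campaign (assumed)

# connect % = connected attempts / total attempts
connect_pct = 100 * sum(a["connected"] for a in attempts) / len(attempts)
# penetration % = distinct accounts attempted / accounts loaded
penetration_pct = 100 * len({a["account"] for a in attempts}) / total_accounts

print(f"connect %: {connect_pct:.1f}, penetration %: {penetration_pct:.1f}")
# → connect %: 50.0, penetration %: 60.0
```

In practice these numbers would come from the dialer's call-detail records; the point is only the shape of the computation.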


IDEAL CANDIDATE:

Domain & Technical Expertise:

  • Strong understanding of Collections, DMS frameworks, delinquency buckets, and customer treatment strategies.
  • Hands-on experience with enterprise-grade dialer systems (Genesys / Avaya / Aspect / Ameyo / NICE / Five9).
  • In-depth knowledge of dialer modes (predictive, preview, progressive), pacing logic, call blending, and routing principles.
  • Working knowledge of CRM systems, CTI integrations, and lead management platforms.


Analytical & Reporting Skills:

  • Advanced Excel skills (Power Query, Pivots, Index/Match, Nested formulas, Data modeling).
  • SQL proficiency preferred for data extraction, query building, and troubleshooting.
  • Ability to interpret large data sets and convert findings into actionable insights.


Execution & Problem-Solving Skills:

  • Strong troubleshooting ability to resolve dialer failures, configuration errors, and performance bottlenecks.
  • High execution discipline with the ability to manage multiple campaigns simultaneously.
  • Strong decision-making capabilities under operational pressure.


Communication & Stakeholder Management:

  • Ability to collaborate effectively with cross-functional teams (Ops, Tech, Analytics, Compliance).
  • Strong documentation, reporting, and presentation skills.
  • Ability to translate analytical findings into business-friendly recommendations.


Eligibility:

  • Open to employees in GB03A / GB03B as per mobility and internal movement guidelines.
  • Must have completed minimum tenure requirements as defined by HR.
  • No active performance improvement plan or disciplinary action.
  • Consistent performance ratings and adherence to compliance norms.


PERKS, BENEFITS AND WORK CULTURE:

Our people define our passion and our audacious, incredibly rewarding achievements. The company is one of India’s most diversified non-banking financial companies, and among Asia’s top 10 Large workplaces. If you have the drive to get ahead, we can help find you an opportunity at any of the 500+ locations we’re present in India.

Read more
Banking Industry


Agency job
via Jobdost by Saida Pathan
Mangalore, Mumbai, Pune, Bengaluru (Bangalore)
3 - 5 yrs
₹8L - ₹10L / yr
SQL
Dashboard
skill iconData Analytics
Database Development

Who is an ideal fit for us?

We seek professionals who are analytical, demonstrate self-motivation, exhibit a proactive mindset, and possess a strong sense of responsibility and ownership in their work.

 

What will you get to work on?


As a member of the Implementation & Analytics team, you will:

● Design, develop, and optimize complex SQL queries to extract, transform, and analyze data

● Create advanced reports and dashboards using SQL, stored procedures, and other reporting tools

● Develop and maintain database structures, stored procedures, functions, and triggers

● Optimize database performance by tuning SQL queries and indexing to handle large datasets efficiently

● Collaborate with business stakeholders and analysts to understand analytics requirements

● Automate data extraction, transformation, and reporting processes to improve efficiency


What do we expect from you?


For the SQL/Oracle Developer role, we are seeking candidates with the following skills and Expertise:

● Proficiency in SQL (Window functions, stored procedures) and MS Excel (advanced Excel skills)

● More than 3 years of relevant experience

● Java / Python experience is a plus but not mandatory

● Strong communication skills to interact with customers to understand their requirements

● Capable of working independently with minimal guidance, showcasing self-reliance and initiative

● Previous experience in automation projects is preferred

● Work From Office: Bangalore/Navi Mumbai/Pune/Client locations
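Since the role leans on SQL window functions, a minimal sketch using Python's bundled sqlite3 (window functions need SQLite 3.25+, which recent Python builds ship); the table and data are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("North", 100), ("North", 300), ("South", 200)])

# Rank each sale within its region and attach the regional total,
# both computed with window functions rather than a self-join.
rows = conn.execute("""
    SELECT region, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk,
           SUM(amount) OVER (PARTITION BY region) AS region_total
    FROM sales
    ORDER BY region, rnk
""").fetchall()

for row in rows:
    print(row)
# → ('North', 300, 1, 400)
#   ('North', 100, 2, 400)
#   ('South', 200, 1, 200)
```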


Read more
Client

at Client

2 candid answers
Harshit Matta
Posted by Harshit Matta
Bengaluru (Bangalore), Pune, Hyderabad, Jaipur, Noida
6 - 8 yrs
₹15L - ₹22L / yr
Salesforce development

Salesforce Tech Lead

6+ yrs 

Any 360 location - Hybrid

Can Join within 15-20 days


Strong in LWC

Must have experience in Aura

Strong in Integration

Strong in Deployment

Must be able to handle complex data

Should be able to manage a team

Must be able to coordinate with multiple stakeholders

Read more
Talent Pro
Mayank choudhary
Posted by Mayank choudhary
Pune
1 - 2 yrs
₹4L - ₹6L / yr
Data engineering
PowerBI

Strong Data engineer profile

Mandatory (Experience 1): Must have 6 months+ of hands-on Data Engineering experience.

Mandatory (Experience 2): Must have end-to-end experience in building & maintaining ETL/ELT pipelines (not just BI/reporting).

Mandatory (Technical): Must have strong SQL capability

Preferred

Preferred (Experience): Worked on Call center data

Job Specific Criteria

CV Attachment is mandatory

Have you used Databricks or any notebook environment?

Have you worked on ETL/ELT workflows?

We work alternate Saturdays. Are you comfortable working from home on the 1st and 4th Saturdays?

Read more
Wissen Technology

at Wissen Technology

4 recruiters
Annie Varghese
Posted by Annie Varghese
Bengaluru (Bangalore), Pune
5 - 10 yrs
Best in industry
microsoft fabric
data lake
Data Warehouse (DWH)
data pipeline
Data engineering
+2 more

Role: Azure Fabric Data Engineer

Experience: 5–10 Years

Location: Pune/Bangalore

Employment Type: Full-Time

About the Role

We are looking for an experienced Azure Data Engineer with strong expertise in Microsoft Fabric and Power BI to build scalable data pipelines, Lakehouse architectures, and enterprise analytics solutions on the Azure cloud.

Key Responsibilities

  • Design & build data pipelines using Microsoft Fabric (Pipelines, Dataflows Gen2, Notebooks).
  • Develop and optimize Lakehouse / Data Lake / Delta Lake architectures.
  • Build ETL/ELT workflows using Fabric, Azure Data Factory, or Synapse.
  • Create and optimize Power BI datasets, data models, and DAX calculations.
  • Implement semantic models, incremental refresh, and Direct Lake/DirectQuery.
  • Work with Azure services: ADLS Gen2, Azure SQL, Synapse, Event Hub, Functions, Databricks.
  • Build dimensional models (Star/Snowflake) and support BI teams.
  • Ensure data governance & security using Purview, RBAC, and AAD.

Required Skills

  • Strong hands-on experience with Microsoft Fabric (Lakehouse, Pipelines, Dataflows, Notebooks).
  • Expertise in Power BI (DAX, modeling, Dataflows, optimized datasets).
  • Deep knowledge of Azure Data Engineering stack (ADF, ADLS, Synapse, SQL).
  • Strong SQL, Python/PySpark skills.
  • Experience in Delta Lake, Medallion architecture, and data quality frameworks.
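The bronze→silver step of the Medallion architecture mentioned above can be sketched in plain Python as a toy stand-in for a Fabric notebook or Spark job; all records and cleaning rules here are invented:

```python
# Raw ("bronze") records land as-is; the "silver" layer applies typing,
# deduplication, and normalization.
bronze = [
    {"id": "1", "amount": " 250 ", "currency": "inr"},
    {"id": "2", "amount": "bad",   "currency": "INR"},   # malformed row
    {"id": "1", "amount": " 250 ", "currency": "inr"},   # duplicate
]

def to_silver(rows):
    seen, silver = set(), []
    for r in rows:
        try:
            amount = float(r["amount"].strip())
        except ValueError:
            continue  # a real pipeline would quarantine, not drop silently
        if r["id"] in seen:
            continue  # deduplicate on the business key
        seen.add(r["id"])
        silver.append({"id": r["id"], "amount": amount,
                       "currency": r["currency"].upper()})
    return silver

print(to_silver(bronze))
# → [{'id': '1', 'amount': 250.0, 'currency': 'INR'}]
```

In Fabric or Databricks the same logic would be expressed as Delta Lake table writes, but the layering idea is identical.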

Nice to Have

  • Azure Certifications (DP-203, PL-300, Fabric Analytics Engineer).
  • Experience with CI/CD (Azure DevOps/GitHub).
  • Databricks experience (preferred).


Note: One technical round must be taken face-to-face (F2F) at either the Pune or Bangalore office

Read more
One2n

at One2n

3 candid answers
Reshika Mendiratta
Posted by Reshika Mendiratta
Pune
6yrs+
Upto ₹35L / yr (Varies)
skill iconKubernetes
Monitoring
skill iconAmazon Web Services (AWS)
JVM
skill iconDocker
+7 more

About the role:

We are looking for a Senior Site Reliability Engineer who understands the nuances of production systems. If you care about building and running reliable software systems in production, you'll like working at One2N.

You will primarily work with our startups and mid-size clients. We work on One-to-N kinds of problems (hence the name One2N): those where the proof of concept is done and the work revolves around scalability, maintainability, and reliability. In this role, you will be responsible for architecting and optimizing our observability and infrastructure to provide actionable insights into performance and reliability.


Responsibilities:

  • Conceptualise, think, and build platform engineering solutions with a self-serve model to enable product engineering teams.
  • Provide technical guidance and mentorship to young engineers.
  • Participate in code reviews and contribute to best practices for development and operations.
  • Design and implement comprehensive monitoring, logging, and alerting solutions to collect, analyze, and visualize data (metrics, logs, traces) from diverse sources.
  • Develop custom monitoring metrics, dashboards, and reports to track key performance indicators (KPIs), detect anomalies, and troubleshoot issues proactively.
  • Improve Developer Experience (DX) to help engineers improve their productivity.
  • Design and implement CI/CD solutions to optimize velocity and shorten the delivery time.
  • Help SRE teams set up on-call rosters and coach them for effective on-call management.
  • Automate repetitive manual tasks across CI/CD pipelines and operations, and apply infrastructure-as-code (IaC) practices.
  • Stay up-to-date with emerging technologies and industry trends in cloud-native, observability, and platform engineering space.


Requirements:

  • 6-9 years of professional experience in DevOps practices or software engineering roles, with a focus on Kubernetes on an AWS platform.
  • Expertise in observability and telemetry tools and practices, including hands-on experience with tools such as Datadog, Honeycomb, ELK, Grafana, and Prometheus.
  • Working knowledge of programming using Golang, Python, Java, or equivalent.
  • Skilled in diagnosing and resolving Linux operating system issues.
  • Strong proficiency in scripting and automation to build monitoring and analytics solutions.
  • Solid understanding of microservices architecture, containerization (Docker, Kubernetes), and cloud-native technologies.
  • Experience with infrastructure as code (IaC) tools such as Terraform, Pulumi.
  • Excellent analytical and problem-solving skills, keen attention to detail, and a passion for continuous improvement.
  • Strong written, communication, and collaboration skills, with the ability to work effectively in a fast-paced, agile environment.
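A rolling error-rate alert, the kind of rule an observability stack like the one described here encodes, can be sketched as follows; the window size and threshold are arbitrary for the example:

```python
from collections import deque

WINDOW, THRESHOLD = 10, 0.3   # assumed: last 10 requests, 30% error rate
recent = deque(maxlen=WINDOW)

def record(status_code):
    """Record one request; return True once the rolling error rate
    over a full window exceeds the threshold."""
    recent.append(status_code >= 500)
    return len(recent) == WINDOW and sum(recent) / WINDOW > THRESHOLD

# 6 successes followed by 4 server errors → 40% > 30% on the 10th request.
fired = [record(code) for code in [200] * 6 + [500] * 4]
print(fired[-1])  # → True
```

Production systems would express this as a Prometheus alerting rule or a Datadog monitor rather than application code; the sketch only shows the underlying computation.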
Read more
Financial Services Company


Agency job
via Peak Hire Solutions by Dhara Thakkar
Pune
4 - 8 yrs
₹10L - ₹13L / yr
SQL
databricks
PowerBI
Windows Azure
Data engineering
+9 more

Review Criteria

  • Strong Senior Data Engineer profile
  • 4+ years of hands-on Data Engineering experience
  • Must have experience owning end-to-end data architecture and complex pipelines
  • Must have advanced SQL capability (complex queries, large datasets, optimization)
  • Must have strong Databricks hands-on experience
  • Must be able to architect solutions, troubleshoot complex data issues, and work independently
  • Must have Power BI integration experience
  • CTC structure is 80% fixed and 20% variable


Preferred

  • Worked on call center data and understands the nuances of data generated in call centers
  • Experience implementing data governance, quality checks, or lineage frameworks
  • Experience with orchestration tools (Airflow, ADF, Glue Workflows), Python, Delta Lake, Lakehouse architecture


Job Specific Criteria

  • CV Attachment is mandatory
  • Are you Comfortable integrating with Power BI datasets?
  • We work alternate Saturdays. Are you comfortable working from home on the 1st and 4th Saturdays?


Role & Responsibilities

We are seeking a highly experienced Senior Data Engineer with strong architectural capability, excellent optimisation skills, and deep hands-on experience in modern data platforms. The ideal candidate will have advanced SQL skills, strong expertise in Databricks, and practical experience working across cloud environments such as AWS and Azure. This role requires end-to-end ownership of complex data engineering initiatives, including architecture design, data governance implementation, and performance optimisation. You will collaborate with cross-functional teams to build scalable, secure, and high-quality data solutions.

 

Key Responsibilities:

  • Lead the design and implementation of scalable data architectures, pipelines, and integration frameworks.
  • Develop, optimise, and maintain complex SQL queries, transformations, and Databricks-based data workflows.
  • Architect and deliver high-performance ETL/ELT processes across cloud platforms.
  • Implement and enforce data governance standards, including data quality, lineage, and access control.
  • Partner with analytics, BI (Power BI), and business teams to enable reliable, governed, and high-value data delivery.
  • Optimise large-scale data processing, ensuring efficiency, reliability, and cost-effectiveness.
  • Monitor, troubleshoot, and continuously improve data pipelines and platform performance.
  • Mentor junior engineers and contribute to engineering best practices, standards, and documentation.


Ideal Candidate

  • Proven industry experience as a Senior Data Engineer, with ownership of high-complexity projects.
  • Advanced SQL skills with experience handling large, complex datasets.
  • Strong expertise with Databricks for data engineering workloads.
  • Hands-on experience with major cloud platforms — AWS and Azure.
  • Deep understanding of data architecture, data modelling, and optimisation techniques.
  • Familiarity with BI and reporting environments such as Power BI.
  • Strong analytical and problem-solving abilities with a focus on data quality and governance.
  • Proficiency in Python or another programming language is a plus.
Read more
Non-Banking Financial Company


Agency job
via Peak Hire Solutions by Dhara Thakkar
Pune
4 - 8 yrs
₹8L - ₹13L / yr
SQL
databricks
PowerBI
Data engineering
Data architecture
+7 more

ROLES AND RESPONSIBILITIES:

We are seeking a highly experienced Senior Data Engineer with strong architectural capability, excellent optimisation skills, and deep hands-on experience in modern data platforms. The ideal candidate will have advanced SQL skills, strong expertise in Databricks, and practical experience working across cloud environments such as AWS and Azure. This role requires end-to-end ownership of complex data engineering initiatives, including architecture design, data governance implementation, and performance optimisation. You will collaborate with cross-functional teams to build scalable, secure, and high-quality data solutions.


Key Responsibilities:

  • Lead the design and implementation of scalable data architectures, pipelines, and integration frameworks.
  • Develop, optimise, and maintain complex SQL queries, transformations, and Databricks-based data workflows.
  • Architect and deliver high-performance ETL/ELT processes across cloud platforms.
  • Implement and enforce data governance standards, including data quality, lineage, and access control.
  • Partner with analytics, BI (Power BI), and business teams to enable reliable, governed, and high-value data delivery.
  • Optimise large-scale data processing, ensuring efficiency, reliability, and cost-effectiveness.
  • Monitor, troubleshoot, and continuously improve data pipelines and platform performance.
  • Mentor junior engineers and contribute to engineering best practices, standards, and documentation.


IDEAL CANDIDATE:

  • Proven industry experience as a Senior Data Engineer, with ownership of high-complexity projects.
  • Advanced SQL skills with experience handling large, complex datasets.
  • Strong expertise with Databricks for data engineering workloads.
  • Hands-on experience with major cloud platforms — AWS and Azure.
  • Deep understanding of data architecture, data modelling, and optimisation techniques.
  • Familiarity with BI and reporting environments such as Power BI.
  • Strong analytical and problem-solving abilities with a focus on data quality and governance.
  • Proficiency in Python or another programming language is a plus.


PERKS, BENEFITS AND WORK CULTURE:

Our people define our passion and our audacious, incredibly rewarding achievements. The company is one of India’s most diversified Non-banking financial companies, and among Asia’s top 10 Large workplaces. If you have the drive to get ahead, we can help find you an opportunity at any of the 500+ locations we’re present in India.

Read more
Non-Banking Financial Company


Agency job
via Peak Hire Solutions by Dhara Thakkar
Pune
1 - 2 yrs
₹5L - ₹6.1L / yr
SQL
databricks
PowerBI
Data engineering
ETL
+6 more

ROLES AND RESPONSIBILITIES:

We are looking for a Junior Data Engineer who will work under guidance to support data engineering tasks, perform basic coding, and actively learn modern data platforms and tools. The ideal candidate should have foundational SQL knowledge and basic exposure to Databricks. This role is designed for early-career professionals who are eager to grow into full data engineering responsibilities while contributing to data pipeline operations and analytical support.


Key Responsibilities:

  • Support the development and maintenance of data pipelines and ETL/ELT workflows under mentorship.
  • Write basic SQL queries, transformations, and assist with Databricks notebook tasks.
  • Help troubleshoot data issues and contribute to ensuring pipeline reliability.
  • Work with senior engineers and analysts to understand data requirements and deliver small tasks.
  • Assist in maintaining documentation, data dictionaries, and process notes.
  • Learn and apply data engineering best practices, coding standards, and cloud fundamentals.
  • Support basic tasks related to Power BI data preparation or integrations as needed.


IDEAL CANDIDATE:

  • Foundational SQL skills with the ability to write and understand basic queries.
  • Basic exposure to Databricks, data transformation concepts, or similar data tools.
  • Understanding of ETL/ELT concepts, data structures, and analytical workflows.
  • Eagerness to learn modern data engineering tools, technologies, and best practices.
  • Strong problem-solving attitude and willingness to work under guidance.
  • Good communication and collaboration skills to work with senior engineers and analysts.


PERKS, BENEFITS AND WORK CULTURE:

Our people define our passion and our audacious, incredibly rewarding achievements. Bajaj Finance Limited is one of India’s most diversified Non-banking financial companies, and among Asia’s top 10 Large workplaces. If you have the drive to get ahead, we can help find you an opportunity at any of the 500+ locations we’re present in India.

Read more
Talent Pro
Mayank choudhary
Posted by Mayank choudhary
Pune
4 - 8 yrs
₹10L - ₹15L / yr
Data engineering
PowerBI
  • Strong Senior Data Engineer profile
  • Mandatory (Experience 1): Must have 4+ years of hands-on Data Engineering experience
  • Mandatory (Experience 2): Must have experience owning end-to-end data architecture and complex pipelines
  • Mandatory (Technical 1): Must have advanced SQL capability (complex queries, large datasets, optimization)
  • Mandatory (Technical 2): Must have strong Databricks hands-on experience
  • Mandatory (Role Requirement): Must be able to architect solutions, troubleshoot complex data issues, and work independently
  • Mandatory (BI Requirement): Must have Power BI integration experience
  • Mandatory (Note): Bajaj CTC has 80% fixed and 20% variable


Read more
Talent Pro
Mayank choudhary
Posted by Mayank choudhary
Pune
1 - 2 yrs
₹4L - ₹7L / yr
Data engineering

Strong Data engineer profile

Mandatory (Experience 1): Must have 6 months+ of hands-on Data Engineering experience.

Mandatory (Experience 2): Must have end-to-end experience in building & maintaining ETL/ELT pipelines (not just BI/reporting).

Mandatory (Technical): Must have strong SQL capability

Read more
Non-Banking Financial Company


Agency job
via Peak Hire Solutions by Dhara Thakkar
Pune
3 - 5 yrs
₹7L - ₹9L / yr
Video production
Video Editing
Content Management System (CMS)
Videography
Video Making
+15 more

ROLES AND RESPONSIBILITIES:

Video-led Content Strategy:

  • Implement the video-led content strategy to meet business objectives such as increases in views, leads, product awareness, and CTR
  • Execute the video content framework that talks to a varied TG, mixes formats and languages (English and vernacular)


Production:

  • Ability to write/edit clear and concise copy and briefs unique to the platform, and influence/direct designers and agencies for creative output
  • Creating the monthly production pipeline, review and edit every piece of video content that is produced
  • Explore AI-based and production automation tools to help create communication stimuli at scale
  • Manage our agency and partner ecosystem to support high-scale video production
  • Manage the monthly production flow and maintain data sheets to enable ease of tracking
  • Increase CTR, Views and other critical metrics for all video production


Project Management:

  • Oversee the creation of various video formats, including explainer videos, product demos, customer testimonials, etc.
  • Plan and manage video production schedules, ensuring timely delivery of projects within budget
  • Upload and manage video content across multiple digital platforms, including the digital platforms and other relevant platforms
  • Ensure all video content is optimised for each platform, following best practices for SEO and audience engagement
  • Coordinate with the content team to integrate video content on the platforms
  • Maintain an archive of video assets and ensure proper documentation and tagging


Capabilities:

  • Drive the development of capabilities around production, automation, upload, thereby leading to a reduction in TAT and effort
  • Work with technology teams to explore Gen AI tools to deliver output at scale and speed
  • Identifying opportunities for new formats and keeping up with trends in the video content space


Customer obsession and governance:

  • Relentless focus on making customer interactions non-intrusive; using video content to create a frictionless experience
  • Zero tolerance for content and communication errors
  • Develop a comprehensive video guidelines framework that is easy to use by businesses yet creates a distinct identity for the brand
  • Have a strong eye for grammar and ensure that every content unit adheres to the brand tone of voice
  • Create checks and balances in the system so that all customer-facing content is first time right, every time


Performance tracking:

  • Tracking and analysing production, go-live status and engagement metrics using tools like Google Analytics, etc.
  • Gauge efficacy of video content produced, and drive changes wherever needed
  • Provide regular reports on video performance, identifying trends, insights, and areas for improvement


IDEAL CANDIDATE:

Qualifications:

  • Bachelor’s degree in Communications, Digital Marketing, Advertising, or a related field
  • Proven experience as a creative/content writer or in a similar role, preferably with exposure to AI-driven content creation.


Work Experience:

  • 3–5 years of relevant experience in content marketing/advertising; experience in digital marketing with a focus on video content will be an advantage


Skills :

  • Excellent command over the English language
  • Hands-on experience of copywriting, editing, and creating communication
  • Ability to handle complex briefs and ideate out of the box
  • Creative thinking and problem-solving skills, with a passion for storytelling and visual communication
  • Deep customer focus by understanding customer behaviour and analysing data & real-world experiences
  • Detail orientation and structured thinking, considering the customer's entire journey and experience
  • Strong communication and collaboration skills to effectively work with diverse teams
  • Passion for emerging technologies and the ability to adapt to a fast-paced and evolving environment
  • Excellent project management skills, with the ability to manage multiple projects simultaneously and meet tight deadlines
  • Proficiency in AI tools and video editing software (e.g., Adobe Premiere Pro, Final Cut Pro) and familiarity with graphic design software (e.g., Adobe After Effects, Photoshop)


PERKS, BENEFITS AND WORK CULTURE:

Our people define our passion and our audacious, incredibly rewarding achievements. Bajaj Finance Limited is one of India’s most diversified Non-banking financial companies, and among Asia’s top 10 Large workplaces. If you have the drive to get ahead, we can help find you an opportunity at any of the 500+ locations we’re present in India.

Read more
Financial Services Company


Agency job
via Peak Hire Solutions by Dhara Thakkar
Pune
2 - 5 yrs
₹8L - ₹10.7L / yr
SQL Azure
databricks
ETL
SQL
Data modeling
+4 more

ROLES AND RESPONSIBILITIES:

We are seeking a skilled Data Engineer who can work independently on data pipeline development, troubleshooting, and optimisation tasks. The ideal candidate will have strong SQL skills, hands-on experience with Databricks, and familiarity with cloud platforms such as AWS and Azure. You will be responsible for building and maintaining reliable data workflows, supporting analytical teams, and ensuring high-quality, secure, and accessible data across the organisation.


KEY RESPONSIBILITIES:

  • Design, develop, and maintain scalable data pipelines and ETL/ELT workflows.
  • Build, optimise, and troubleshoot SQL queries, transformations, and Databricks data processes.
  • Work with large datasets to deliver efficient, reliable, and high-performing data solutions.
  • Collaborate closely with analysts, data scientists, and business teams to support data requirements.
  • Ensure data quality, availability, and security across systems and workflows.
  • Monitor pipeline performance, diagnose issues, and implement improvements.
  • Contribute to documentation, standards, and best practices for data engineering processes.


IDEAL CANDIDATE:

  • Proven experience as a Data Engineer or in a similar data-focused role (3+ years).
  • Strong SQL skills with experience writing and optimising complex queries.
  • Hands-on experience with Databricks for data engineering tasks.
  • Experience with cloud platforms such as AWS and Azure.
  • Understanding of ETL/ELT concepts, data modelling, and pipeline orchestration.
  • Familiarity with Power BI and data integration with BI tools.
  • Strong analytical and troubleshooting skills, with the ability to work independently.
  • Experience working end-to-end on data engineering workflows and solutions.


PERKS, BENEFITS AND WORK CULTURE:

Our people define our passion and our audacious, incredibly rewarding achievements. The company is one of India’s most diversified Non-banking financial companies, and among Asia’s top 10 Large workplaces. If you have the drive to get ahead, we can help find you an opportunity at any of the 500+ locations we’re present in India.

Read more
Talent Pro
Mayank choudhary
Posted by Mayank choudhary
Pune
2 - 5 yrs
₹8L - ₹11L / yr
Data modeling
ETL

Strong Data engineer profile

Mandatory (Experience 1): Must have 2+ years of hands-on Data Engineering experience.

Mandatory (Experience 2): Must have end-to-end experience in building & maintaining ETL/ELT pipelines (not just BI/reporting).

Mandatory (Technical 1): Must have strong SQL capability (complex queries + optimization).

Mandatory (Technical 2): Must have hands-on Databricks experience.

Mandatory (Role Requirement): Must be able to work independently, troubleshoot data issues, and manage large datasets.
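A minimal end-to-end ETL sketch of the kind these criteria describe, using only the Python standard library; the source feed, cleaning rule, and target schema are invented:

```python
import csv
import io
import sqlite3

# Extract: a raw CSV feed (inlined here; normally a file or API response).
raw = "id,amount\n1,100\n2,250\n3,\n"

# Transform: drop incomplete rows and cast types.
rows = [r for r in csv.DictReader(io.StringIO(raw)) if r["amount"]]
for r in rows:
    r["amount"] = int(r["amount"])

# Load: write into a warehouse-style table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fact_payment (id TEXT, amount INTEGER)")
conn.executemany("INSERT INTO fact_payment VALUES (:id, :amount)", rows)

total = conn.execute("SELECT SUM(amount) FROM fact_payment").fetchone()[0]
print(total)  # → 350
```

Real pipelines would add orchestration, retries, and data-quality checks around each stage, but the extract → transform → load shape is the same.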

Read more
Leapfrog

at Leapfrog

4 candid answers
1 recruiter
Bisman Gill
Posted by Bisman Gill
Pune
6yrs+
Upto ₹38L / yr (Varies)
skill iconPython
Computer Vision
Natural Language Processing (NLP)
NumPy
Scikit-Learn
+2 more

As a Lead Artificial Intelligence Engineer at Leapfrog Technology, you will be at the forefront of shaping the future of data-driven solutions. You'll lead a talented team, drive the development of innovative AI projects, and work collaboratively across functions to turn complex business challenges into actionable insights.


Key Responsibilities:

● Leadership Excellence: Lead and inspire a team of AI Engineers and Data Scientists, fostering a culture of innovation, collaboration, and continuous growth.

● End-to-End Ownership: Take full ownership of the AI project lifecycle, from ideation and design to development, deployment, and maintenance.

● Technological Innovation: Explore and assess emerging technologies to enhance the performance, maintainability, and reliability of AI systems.

● Engineering Best Practices: Apply robust software engineering practices to AI, including CI/CD pipelines, automation, and quality assurance.

● Architectural Leadership: Collaborate with technology experts to make informed architectural decisions, and ensure thorough technical documentation.

● Risk Mitigation: Proactively identify and address project risks, conduct root cause analysis, and implement preventive measures.

● Cross-functional Collaboration: Engage closely with cross-functional teams, including business stakeholders, product managers, software engineers, and data engineers, to deliver impactful data-driven solutions.

● Continuous Learning: Stay at the cutting edge of data science, ML, and AI developments, and leverage emerging technologies to solve complex business problems.

● Mentorship and Growth: Coach and motivate team members, identify training needs, and foster their professional development.

● Organizational Excellence: Actively uphold and promote the company's culture, processes, and standards to ensure consistent excellence in our work.


Job requirements

Education and Experience:

  • A degree (Masters preferred) in Computer Science, Engineering, Artificial Intelligence, Data Science, Applied Mathematics, or related fields.
  • 6+ years of hands-on experience in AI/ML or Data Science, preferably in real industry settings, with a track record of building data products that have positively impacted customer satisfaction and revenue growth.

Technical Skills:

  • Proficiency in a wide range of Machine Learning techniques and algorithms, with the ability to apply advanced analytics methods, including Bayesian statistics, clustering, text analysis, time series analysis, and neural networks on large-scale datasets.
  • Expertise in at least one specialized area of application, such as Computer Vision or Natural Language Processing (NLP). (NLP Expertise preferred)
  • Strong programming skills in Python, including expertise in the data ecosystem (NumPy, SciPy, Pandas, etc.), or equivalent skills in languages like R, Java, Scala, or Julia, with a focus on producing production-quality code.
  • Hands-on experience with popular ML frameworks like Scikit-Learn, PyTorch, or TensorFlow.
  • Expertise with Generative AI and Large Language Models (LLMs) along with their implementation in real-life applications.
  • Experience building end-to-end ML systems (MLOps).
  • Experience deploying code in web frameworks such as Flask, FastAPI, or Django.
  • Experience working in a cloud environment like AWS, Azure, or GCP for ML work.
  • Good grasp of SQL/NoSQL databases and scripting skills, particularly within analytics platforms and data warehouses.
  • Good grasp of software engineering concepts (SDLC, Version Control, CI/CD, Containerization, Scalability and so on), programming concepts, and tools/platforms like Git and Docker.
  • Bonus: Experience with Big Data technologies such as Apache Spark, Kafka, Kinesis, and cloud-based ML platforms like AWS SageMaker or GCP ML Engine.
  • Bonus: Experience with data visualization and dashboard tools like Tableau or Power BI.

    Soft Skills:

  • Highly motivated, self-driven, entrepreneurial mindset, and capable of solving complex analytical problems under high-pressure situations.
  • Ability to work with cross-functional and cross-regional teams.
  • Ability to lead a team of Data/AI professionals and work with senior management, technological experts, and the product team.
  • Excellent written and verbal communication skills, comfortable with client communication.
  • Good leadership skills - ability to motivate and mentor team members, ability to plan and make sound decisions, ability to negotiate tactfully with the client and team.
  • Results-oriented, customer-focused with a passion for resolving tough technical and operational challenges.
  • Possess excellent analytical and problem-solving abilities.
  • Good documentation skills.
  • Experienced with Agile methodologies like Scrum/Kanban.


Read more
Pune
2 - 6 yrs
₹0.5L - ₹4.5L / yr
Paid Marketing
performance marketing campaigns
Google Ads
Meta Ads
LinkedIn Ads
+15 more

🔹 About the Role:


Title: Paid Marketing Executive — Mid-level (2+ years experience)

We’re looking for a data-driven Paid Marketing Executive to plan, execute, and optimize performance marketing campaigns across Google Ads, Meta Ads, and LinkedIn Ads.

The ideal candidate will have hands-on experience in lead generation, campaign analytics, and ROI optimization, with the ability to turn insights into impactful growth strategies.


🎯 Key Responsibilities:

  • Plan, launch, and optimize campaigns across Google, Meta, and LinkedIn Ads.
  • Manage end-to-end campaign execution — targeting, ad setup, A/B testing, landing page optimization, and reporting.
  • Drive lead generation and user acquisition while maintaining low CPA and high ROAS.
  • Run remarketing, retargeting, and lookalike campaigns to improve conversion rates.
  • Implement A/B testing for creatives, copies, and funnels to enhance performance.
  • Utilize AI tools and automation to streamline campaign management and reporting.
  • Monitor and analyze KPIs (CTR, CPC, CPA, ROAS, Conversion Rates) for ongoing performance improvement.
  • Prepare and present reports using Google Analytics, GTM, Meta Ads Manager, and LinkedIn Campaign Manager.
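For reference, the KPIs listed above follow standard formulas (exact definitions can vary slightly by team); a small sketch:

```python
# Standard paid-marketing KPI formulas, as commonly defined.
def kpis(spend: float, impressions: int, clicks: int,
         conversions: int, revenue: float) -> dict:
    return {
        "CTR": clicks / impressions,   # click-through rate
        "CPC": spend / clicks,         # cost per click
        "CPA": spend / conversions,    # cost per acquisition
        "CVR": conversions / clicks,   # conversion rate
        "ROAS": revenue / spend,       # return on ad spend
    }

m = kpis(spend=5000.0, impressions=100_000, clicks=2000,
         conversions=100, revenue=20000.0)
print(m["ROAS"])  # 4.0
```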


🧩 Requirements:

  • Minimum 2 years of experience in Performance Marketing / Digital Marketing (preferably B2B or Finance sector).
  • Hands-on experience with Google Ads, Meta Ads, and LinkedIn Ads.
  • Strong analytical mindset with experience in funnel optimization and A/B testing.
  • Proficiency in analytics tools – Google Analytics, GTM, and Ads Managers.
  • Ability to manage budgets effectively and deliver measurable ROI.
  • Excellent communication and collaboration skills.
  • Bachelor’s degree in Marketing, Business, or a related field.
  • Certifications (Google Ads / Facebook Blueprint) are a plus.


📈 If you’re passionate about paid campaigns, analytics, and driving digital growth — this role is for you!


Join our dynamic marketing team and make an impact with data-backed decisions.

Read more
Vijay Sales
Posted by Tech Recruiter
Pune
1 - 6 yrs
₹6L - ₹15L / yr
databricks
SQL
Python
Artificial Intelligence (AI)
Generative AI
+1 more

About Vijay Sales

Vijay Sales is one of India’s leading electronics retail brands with 160+ stores nationwide and a fast-growing digital presence. We are on a mission to build the most advanced data-driven retail intelligence ecosystem—using AI, predictive analytics, LLMs, and real-time automation to transform customer experience, supply chain, and omnichannel operations.


Role Overview

We are looking for a highly capable AI Engineer who is passionate about building production-grade AI systems, designing scalable ML architecture, and working with cutting-edge AI/ML tools. This role involves hands-on work with Databricks, SQL, PySpark, modern LLM/GenAI frameworks, and full lifecycle ML system design.


Key Responsibilities


Machine Learning & AI Development

  • Build, train, and optimize ML models for forecasting, recommendation, personalization, churn prediction, inventory optimization, anomaly detection, and pricing intelligence.
  • Develop GenAI solutions using modern LLM frameworks (e.g., LangChain, LlamaIndex, HuggingFace Transformers).
  • Explore and implement RAG (Retrieval Augmented Generation) pipelines for product search, customer assistance, and support automation.
  • Fine-tune LLMs on company-specific product and sales datasets (using QLoRA, PEFT, and Transformers).
  • Develop scalable feature engineering pipelines leveraging Delta Lake and Databricks Feature Store.
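The RAG pipelines mentioned above follow a retrieve-then-augment shape. As a toy illustration only (real pipelines use learned embeddings and a vector store; bag-of-words cosine similarity stands in for both here, and the documents are invented), the retrieval step might look like:

```python
# Toy RAG retrieval: rank documents by similarity to the query,
# then splice the best match into the prompt as context.
import math
from collections import Counter

def vectorize(text: str) -> Counter:
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list, k: int = 1) -> list:
    qv = vectorize(query)
    return sorted(docs, key=lambda d: cosine(qv, vectorize(d)), reverse=True)[:k]

docs = [
    "55 inch LED TV with 4K display and HDMI ports",
    "front load washing machine 8kg drum",
    "split air conditioner 1.5 ton inverter",
]
context = retrieve("which TV has a 4K display", docs, k=1)
prompt = f"Answer using only this context: {context[0]}"
print(context[0])
```

The production version swaps `vectorize` for an embedding model and the `sorted` call for a vector-database query, but the shape of the pipeline is unchanged.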

Databricks / Data Engineering

  • Build end-to-end ML workflows on Databricks using PySpark, MLflow, Unity Catalog, Delta Live Tables.
  • Optimize Databricks clusters for cost, speed, and stability.
  • Maintain reusable notebooks and parameterized pipelines for model ingestion, validation, and deployment.
  • Use MLflow for tracking experiments, model registry, and lifecycle management.

Data Handling & SQL

  • Write advanced SQL for multi-source data exploration, aggregation, and anomaly detection.
  • Work on large, complex datasets from ERP, POS, CRM, Website, and Supply Chain systems.
  • Automate ingestion of streaming and batch data into Databricks pipelines.

Deployment & MLOps

  • Deploy ML models using REST APIs, Databricks Model Serving, Docker, or cloud-native endpoints.
  • Build CI/CD pipelines for ML using GitHub Actions, Azure DevOps, or Databricks Workflows.
  • Implement model monitoring for drift, accuracy decay, and real-time alerts.
  • Maintain GPU/CPU environments for training workflows.

Must-Have Technical Skills

Core AI/ML

  • Strong fundamentals in machine learning: regression, classification, time-series forecasting, clustering.
  • Experience in deep learning using PyTorch or TensorFlow/Keras.
  • Expertise in LLMs, embeddings, vector databases, and GenAI architecture.
  • Hands-on experience with HuggingFace, embedding models, and RAG.

Databricks & Big Data

  • Hands-on experience with Databricks (PySpark, SQL, Delta Lake, MLflow, Feature Store).
  • Strong understanding of Spark execution, partitioning, and optimization.

Programming

  • Strong proficiency in Python.
  • Experience writing high-performance SQL with window functions, CTEs, and analytical queries.
  • Knowledge of Git, CI/CD, REST APIs, and Docker.
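As a small illustration of the window-function and CTE requirement above (SQLite is used purely for portability, and the sales table is invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (store TEXT, month TEXT, revenue REAL);
INSERT INTO sales VALUES
  ('Pune', '2024-01', 100), ('Pune', '2024-02', 150),
  ('Delhi', '2024-01', 80), ('Delhi', '2024-02', 60);
""")

# CTE + window function: running revenue total per store, by month.
query = """
WITH monthly AS (
  SELECT store, month, SUM(revenue) AS revenue
  FROM sales GROUP BY store, month
)
SELECT store, month,
       SUM(revenue) OVER (
         PARTITION BY store ORDER BY month
       ) AS running_total
FROM monthly ORDER BY store, month;
"""
for row in conn.execute(query):
    print(row)
```

The same pattern (CTE feeding a `PARTITION BY ... ORDER BY` window) carries over directly to Spark SQL on Databricks.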

MLOps & Production Engineering

  • Experience deploying models to production and monitoring them.
  • Familiarity with tools like MLflow, Weights & Biases, or SageMaker equivalents.
  • Experience in building automated training pipelines and handling model drift/feedback loops.
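The drift-monitoring requirement above can be sketched naively as a mean-shift check; production systems typically use tests such as PSI or Kolmogorov–Smirnov, and the threshold here is illustrative:

```python
# Naive drift alert: flag when a live batch's feature mean deviates
# from the training baseline by more than `threshold` standard errors.
import statistics

def mean_shift_alert(baseline, live, threshold: float = 3.0) -> bool:
    mu = statistics.fmean(baseline)
    sd = statistics.stdev(baseline)
    se = sd / (len(live) ** 0.5)  # standard error of the live mean
    z = abs(statistics.fmean(live) - mu) / se
    return z > threshold

baseline = [10.0, 11.0, 9.0, 10.5, 9.5, 10.2, 9.8, 10.1]
stable = [10.1, 9.9, 10.3, 9.7]
drifted = [14.0, 15.2, 14.8, 15.5]
print(mean_shift_alert(baseline, stable))   # False
print(mean_shift_alert(baseline, drifted))  # True
```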

Preferred Domain Experience

  • Retail/e-commerce analytics
  • Demand forecasting
  • Inventory optimization
  • Customer segmentation & personalization
  • Price elasticity and competitive pricing


Read more
Pune
1 - 3 yrs
₹2L - ₹5L / yr
Lead Generation
Cold Calling

Job Title: Inside Sales Specialist

Location: Baner, Pune

Experience: 1–3 Years

Employment Type: Full-Time | WFO

About the Role:

We are looking for an Inside Sales Specialist with proven experience in identifying, nurturing, and converting qualified leads through outbound channels. The ideal candidate will have hands-on experience in email campaigns, LinkedIn outreach, and cold calling targeting international markets such as DACH, BENELUX, and the UK.

Key Responsibilities:

  • Generate qualified B2B leads through cold calls, emails, and LinkedIn campaigns.
  • Execute targeted email and LinkedIn outreach campaigns to build a strong sales pipeline.
  • Manage follow-ups, schedule meetings, and support client acquisition activities.
  • Maintain CRM data and provide weekly reports on lead quality and conversions.
  • Collaborate with the sales and marketing team to improve campaign strategies and conversion rates.
  • Ensure a minimum of 40 qualified leads per month are generated through outbound efforts.
  • Identify new markets, decision-makers, and business opportunities across international regions.

Required Skills & Experience:

  • 1–3 years of proven experience in B2B Lead Generation / Inside Sales.
  • Strong communication and persuasion skills in English.
  • Experience with international lead generation (especially DACH, BENELUX & UK markets).
  • Proficiency in using LinkedIn Sales Navigator, CRM tools (HubSpot, Zoho, etc.), and email campaign tools.
  • Ability to research, segment, and target the right prospects.
  • Self-motivated, goal-oriented, and comfortable working with monthly lead targets.


Read more
McKinley Rice
Pune, Noida
5 - 15 yrs
₹5L - ₹25L / yr
MongoDB
NodeJS (Node.js)
Generative AI
Express
DevOps
+2 more

Company Overview 

McKinley Rice is not just a company; it's a dynamic community, the next evolutionary step in professional development. Spiritually, we're a hub where individuals and companies converge to unleash their full potential. Organizationally, we are a conglomerate composed of various entities, each contributing to the larger narrative of global excellence.

Redrob by McKinley Rice: Redefining Prospecting in the Modern Sales Era


Backed by a $40 million Series A funding from leading Korean & US VCs, Redrob is building the next frontier in global outbound sales. We’re not just another database—we’re a platform designed to eliminate the chaos of traditional prospecting. In a world where sales leaders chase meetings and deals through outdated CRMs, fragmented tools, and costly lead-gen platforms, Redrob provides a unified solution that brings everything under one roof.

Inspired by the breakthroughs of Salesforce, LinkedIn, and HubSpot, we’re creating a future where anyone, not just enterprise giants, can access real-time, high-quality data on 700M+ decision-makers, all in just a few clicks.

At Redrob, we believe the way businesses find and engage prospects is broken. Sales teams deserve better than recycled data, clunky workflows, and opaque credit-based systems. That’s why we’ve built a seamless engine for:

  • Precision prospecting
  • Intent-based targeting
  • Data enrichment from 16+ premium sources
  • AI-driven workflows to book more meetings, faster

We’re not just streamlining outbound—we’re making it smarter, scalable, and accessible. Whether you’re an ambitious startup or a scaled SaaS company, Redrob is your growth copilot for unlocking warm conversations with the right people, globally.



EXPERIENCE



Duties you'll be entrusted with:


  • Develop and deploy scalable APIs and applications using the Node.js or Nest.js framework.
  • Write efficient, reusable, testable, and scalable code.
  • Understand and analyze business needs and feature modification requests, and convert them into software components.
  • Integrate user-facing elements and data storage solutions into different applications.
  • Develop backend components, server-side logic, statistical learning models, and highly responsive web applications, with a focus on performance and responsiveness.
  • Design and implement high-availability, low-latency applications with data protection and security features.
  • Tune application performance, automate workflows, and enhance the functionality of existing software systems.
  • Keep abreast of the latest technologies and trends.


Expectations from you:


Basic Requirements


  • Minimum qualification: Bachelor’s degree or more in Computer Science, Software Engineering, Artificial Intelligence, or a related field.
  • Experience with Cloud platforms (AWS, Azure, GCP).
  • Strong understanding of monitoring, logging, and observability practices.
  • Experience with event-driven architectures (e.g., Kafka, RabbitMQ).
  • Expertise in designing, implementing, and optimizing Elasticsearch.
  • Work with modern tools including Jira, Slack, GitHub, Google Docs, etc.
  • Experience in Integrating Generative AI APIs.
  • Working experience in high user concurrency.
  • Experience with databases scaled to millions of records (indexing, retrieval, etc.).


Technical Skills


  • Demonstrable experience in web application development with expertise in Node.js or Nest.js.
  • Knowledge of database technologies and agile development methodologies.
  • Experience working with databases, such as MySQL or MongoDB.
  • Familiarity with web development frameworks, such as Express.js.
  • Understanding of microservices architecture and DevOps principles.
  • Well-versed with AWS and serverless architecture.



Soft Skills


  • A quick and critical thinker who can generate many ideas about a topic and bring fresh, innovative ideas to the table to enhance the impact of our work.
  • Potential to apply innovative and exciting ideas, concepts, and technologies.
  • Stay up-to-date with the latest design trends, animation techniques, and software advancements.
  • Multi-tasking and time-management skills, with the ability to prioritize tasks.


THRIVE


Some of the extensive benefits of being part of our team:


  • We offer skill enhancement and educational reimbursement opportunities to help you further develop your expertise.
  • The Member Reward Program provides an opportunity for you to earn up to INR 85,000 as an annual Performance Bonus.
  • The McKinley Cares Program has a wide range of benefits:
  • The wellness program covers mental wellness and fitness sessions, and offers health insurance.
  • In-house benefits have a referral bonus window and sponsored social functions.
  • An expanded leave basket including paid maternity, paternity, and rejuvenation leaves, in addition to the regular 20 leaves per annum.
  • Our Family Support benefits not only include maternity and paternity leaves but also extend to provide childcare benefits.
  • In addition to the retention bonus, our McKinley Retention Benefits program also includes a Leave Travel Allowance program.
  • We also offer an exclusive McKinley Loan Program designed to assist our employees during challenging times and alleviate financial burdens.


Read more
Hummingbird Web Solutions Private Limited
Pune
6 - 12 yrs
₹12L - ₹18L / yr
User Interface (UI) Design
Graphic Designing
User Experience (UX) Design

We are looking for a Senior Lead Designer at Hummingbird. In this role, you will guide and mentor a team of Graphic and UI/UX designers, ensuring the delivery of high-quality, impactful designs that reflect our brand and enhance the user experience.


  • Design Leadership: Lead, mentor, and guide the design team while maintaining a high standard of creative excellence.
  • Graphic Design: Produce visually compelling graphics and illustrations aligned with brand guidelines across various platforms.
  • UI/UX Design: Create intuitive, user-centric interfaces and seamless digital experiences.
  • Tool Proficiency: Work with tools such as Figma and Adobe Illustrator to design wireframes, prototypes, and final visuals.
  • Communication: Clearly present design concepts, decisions, and feedback to team members and stakeholders.
  • Team Motivation: Inspire and encourage the design team to foster a collaborative and innovative environment.
  • Project Management: Manage multiple design projects to ensure timely delivery and alignment with business goals.
  • Design Best Practices: Advocate for and educate the team on modern design principles, usability, and industry trends.


Qualifications

  • 6 to 12 years of experience in Graphic Design and UI/UX Design, preferably within the e-commerce domain.
  • Proven experience leading and managing a design team.
  • Strong portfolio showcasing successful graphic and UI/UX work.
  • Excellent communication, leadership, and collaboration skills.
  • Ability to adapt to evolving project requirements and thrive in a fast-paced environment.


Read more
Leading MNC in Tiles and Bath-ware


Agency job
via Masflair by Minesh Hemani
Lucknow, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune
2 - 5 yrs
₹8L - ₹14L / yr
Business Development
Sales
Channel Sales

Business Development Executive (Floor and Wall tiles)

About The Company:

A leading MNC in tiles and bathware, with operations in over 40 countries and a group turnover of more than USD 1 billion. The company is strengthening its footprint in India by expanding its sales and business development team.

Job Title: Business Development Executive

Locations: Delhi, Lucknow, and Pune

Responsibilities:

  • Achieve regional client activation and sales targets through proactive territory management.
  • Secure product specifications, approvals, and conversions by providing strategic sales guidance to clients and stakeholders.
  • Meet sales targets across all product lines by planning and executing targeted sales initiatives.
  • Collaborate with the sales team to retain and grow the customer base, building strong relationships with key accounts and identifying new business opportunities.
  • Develop and maintain relationships with architects, builders, interior designers, influencers, government authorities, and corporate customers to drive project specifications and orders.
  • Leverage a strong stakeholder network to drive sustainable revenue growth across assigned markets.

Benefits and Perks: Rs. 8 – 10 LPA (including 10% variable pay).

Qualifications:

  • Industry Experience: Minimum 2 years in sales and business development, specifically in tiles and bathware.
  • Education: Bachelor's or Master's Degree
  • Languages and Communication: Fluency in English and the relevant regional language is mandatory.


Read more
Get to hear about interesting companies hiring right now
Why apply via Cutshort?
Connect with actual hiring teams and get their fast response. No spam.
Find more jobs