
- 5+ years of experience in software development.
- At least 2 years of relevant work experience on large-scale data applications
- Good attitude, strong problem-solving abilities, analytical skills, and the ability to take ownership as appropriate
- Able to code, debug, performance-tune, and deploy applications to production
- Good working experience with the Hadoop ecosystem (HDFS, Hive, YARN, and file formats such as Avro/Parquet)
- Kafka
- J2EE Frameworks (Spring/Hibernate/REST)
- Spark Streaming or any other streaming technology.
- Java programming is mandatory.
- Ability to work on the sprint stories to completion along with Unit test case coverage.
- Experience working in Agile Methodology
- Excellent communication and coordination skills
- Knowledge of (preferably hands-on experience with) UNIX environments and various continuous integration tools.
- Must be able to integrate quickly into the team and work independently towards team goals
- Take complete responsibility for the execution of sprint stories
- Be accountable for delivering tasks within the defined timelines and with good quality
- Follow the processes for project execution and delivery.
- Follow agile methodology
- Work with the team lead closely and contribute to the smooth delivery of the project.
- Understand/define the architecture and discuss its pros and cons with the team
- Participate in brainstorming sessions and suggest improvements to the architecture/design.
- Work with other team leads to get the architecture/design reviewed.
- Work with the clients and counterparts (in the US) of the project.
- Keep all the stakeholders updated about the project/task status/risks/issues if there are any.
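The streaming requirement above (Spark Streaming or a similar technology) boils down to processing records in small batches and maintaining running aggregates. A minimal, framework-free sketch of that idea, shown in Python for brevity even though the role itself is Java/Spark; all names here are illustrative:

```python
from collections import Counter
from typing import Iterable, List


def process_micro_batch(batch: Iterable[str], counts: Counter) -> Counter:
    """Update running word counts with one micro-batch of messages."""
    for message in batch:
        counts.update(message.split())
    return counts


def run_stream(batches: List[List[str]]) -> Counter:
    """Consume a sequence of micro-batches, as a streaming engine would."""
    counts: Counter = Counter()
    for batch in batches:
        process_micro_batch(batch, counts)
    return counts
```

In Spark Streaming the engine handles batching, checkpointing, and state for you; the per-batch update function is the part the developer writes.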

About Clairvoyant India Private Limited
Job Title: Backend Developer
Experience: 2–7 Years
Location: On-site – Bangalore
Employment Type: Full-Time
Company: Pepsales AI (Multiplicity Technologies Inc.)
About Pepsales
Pepsales AI is a real-time sales enablement and conversation intelligence platform built for B2B SaaS sales teams. It empowers sellers across every stage of the sales cycle—before, during, and after discovery calls—by providing actionable insights that move deals forward.
- For Account Executives: Pepsales AI transforms the discovery process, ensuring objective deal qualification and frictionless handoffs to solution engineers—enabling AEs to focus on winning, not chasing.
- For Solution Engineers and Consultants: It elevates demo experiences by delivering real-time buyer context and actionable insights, ensuring every interaction is highly personalized and impactful.
- For Sales Leaders: It provides enterprise-grade intelligence across forecasting, pipeline health, team performance, coaching, and the authentic voice of the customer—empowering data-driven decision-making at scale.
With Pepsales AI, sales teams run sharper meetings, accelerate deal cycles, and close with confidence.
Role Overview
We’re seeking a passionate Backend Developer to join our fast-paced team in Bangalore and help build and scale the core systems powering Pepsales. In this full-time, on-site role, you’ll work on high-impact features that directly influence product innovation, customer success, and business growth.
You'll collaborate closely with the founding team and leadership, gaining end-to-end ownership and the chance to bring bold, innovative ideas to life in a rapidly scaling startup environment.
Key Responsibilities
- Design, develop, and maintain scalable backend systems and microservices.
- Write clean, efficient, and well-documented code in Python.
- Build and optimize RESTful APIs and WebSocket services for high performance.
- Manage and optimize MongoDB databases for speed and scalability.
- Deploy and maintain containerized applications using Docker.
- Work extensively with AWS services (EC2, ALB, S3, Route 53) for robust cloud infrastructure.
- Implement and maintain CI/CD pipelines for smooth and automated deployments.
- Collaborate closely with frontend engineers, product managers, and leadership on architecture and feature planning.
- Participate in sprint planning, technical discussions, and code reviews.
- Take full ownership of features, embracing uncertainty with a problem-solving mindset.
Required Skills & Qualifications
- 2–7 years of backend development experience with a proven track record of building scalable systems.
- Strong proficiency in Python and its ecosystem.
- Hands-on experience with MongoDB, Docker, and Microservice Architecture.
- Practical experience with AWS services (EC2, ALB, S3, Route 53).
- Familiarity with CI/CD tools and deployment best practices.
- Strong understanding of REST API design principles and WebSocket communication.
- Excellent knowledge of data structures, algorithms, and performance optimization.
- Strong communication skills and ability to work in a collaborative, fast-paced environment.
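Several of the bullets above (REST API design, performance, MongoDB at scale) come together in how an endpoint pages through large result sets. A minimal sketch of limit/offset pagination in plain Python; the function and field names are hypothetical, not Pepsales code:

```python
from typing import Any, Dict, List


def paginate(items: List[Any], limit: int = 20, offset: int = 0) -> Dict[str, Any]:
    """Build a REST-style paginated response body from an in-memory list.

    In a real service the slice would be pushed down to the database
    (e.g. MongoDB's skip/limit) instead of materializing all items.
    """
    page = items[offset:offset + limit]
    return {
        "data": page,
        "meta": {
            "total": len(items),
            "limit": limit,
            "offset": offset,
            "has_more": offset + limit < len(items),
        },
    }
```

Returning a `meta` envelope alongside `data` lets clients iterate without a second count query per page.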
What We Value
- Excitement to work on cutting-edge technology and platforms, helping redefine how businesses engage and convert customers.
- Ability to thrive in a dynamic startup environment that values diversity, rapid innovation, and a growth mindset: adapting to change, challenging the status quo, and making a real impact.
- Passion for ownership beyond coding, contributing to product strategy and innovation.
Why Join Pepsales?
- Direct access to founders and a voice in high-impact decisions.
- Opportunity to shape a next-gen AI SaaS product transforming sales worldwide.
- Exposure to cutting-edge technologies in a rapidly growing startup.
- Ownership-driven culture with fast career growth and learning opportunities.
- A collaborative, innovation-driven environment that values creativity and problem-solving.
Sholinganallur location: (Videocon D2H Process) (60 Open positions)
Process: Videocon D2H
Inbound voice process (Tamil)
Should be flexible for rotational shifts
Male/Female candidates
Qualification: +2 Pass & above
Should reside near Sholinganallur. Immediate joiners only.
CTC: 11,000 fixed (no ESI & PF)
Address:
Tek Meadows Campus
Rajiv Gandhi Salai, Chennai, Tamil Nadu 600119.
Opp Accenture & Near Dollar Bus Stop
The Data Delivery Manager is responsible for managing and overseeing the end-to-end delivery of data solutions, from data architecture and ingestion to processing, analysis, and reporting. This role ensures that data projects are completed on time, within scope, and aligned with business objectives. The Data Delivery Manager works with cross-functional teams, including data engineers, analysts, and other stakeholders, to deliver high-quality data solutions.
Key Responsibilities
Project and Program Management:
- Lead and manage the delivery of data-driven projects, ensuring they are completed on time and within budget.
- Develop and maintain project plans, timelines, and resource allocation.
- Monitor project progress and proactively identify risks, implementing solutions to ensure successful project completion.
- Ensure alignment between data initiatives and business goals, maintaining continuous communication with stakeholders.
Stakeholder Engagement:
- Serve as the primary point of contact between business stakeholders and the data team, translating business requirements into technical specifications.
- Collaborate with business leaders to understand data needs and develop solutions that meet those needs.
- Provide regular updates on project status, risks, and key deliverables to stakeholders and leadership teams.
- Facilitate meetings to gather requirements and communicate project milestones.
Data Strategy & Architecture:
- Oversee the design and implementation of data solutions that support business objectives.
- Ensure adherence to data governance and quality standards throughout the data pipeline lifecycle.
- Collaborate with data architects to define scalable, secure, and high-performance data architectures.
- Support data integration initiatives across different platforms, ensuring smooth and efficient data flow.
Team Leadership and Development:
- Manage cross-functional teams of data engineers, analysts, and other professionals to deliver projects effectively.
- Provide leadership and mentoring to team members, fostering a culture of collaboration, innovation, and continuous learning.
- Set clear expectations, performance goals, and career development opportunities for the team.
- Address and resolve any team challenges, ensuring high levels of productivity and motivation.
Quality Assurance:
- Ensure the accuracy, reliability, and quality of data solutions by implementing robust testing, validation, and review processes.
- Develop and enforce data standards, best practices, and methodologies to ensure consistent, high-quality deliverables.
- Drive continuous improvement initiatives, analyzing past projects and identifying areas for enhancement.
Risk Management and Troubleshooting:
- Identify potential risks to project delivery, including data pipeline issues, resource shortages, or technical blockers, and implement mitigation strategies.
- Troubleshoot and resolve data-related issues and challenges during the project lifecycle.
Reporting and Analytics:
- Oversee the creation and maintenance of data dashboards, reports, and performance metrics.
- Ensure that data solutions provide actionable insights for the business, driving informed decision-making.
Experience:
- 5+ years of experience in managing programs related to data management, data engineering, or analytics.
- Proven track record in managing data projects and delivering data solutions in a timely manner.
- Experience with data integration, data warehousing, and data modeling.
- Strong understanding of data pipeline management, ETL processes, and data analytics.
Skills:
- In-depth knowledge of data management tools and technologies (e.g., SQL, NoSQL, ETL tools, cloud platforms such as AWS, Google Cloud, or Azure).
- Proficient in project management methodologies (Agile, Scrum, Waterfall).
- Strong leadership and team management skills, with the ability to drive collaboration across different functions.
- Excellent communication skills with the ability to translate complex data-related concepts into understandable terms for non-technical stakeholders.
- Analytical mindset with strong problem-solving abilities.
- Knowledge of data visualization and reporting tools (e.g., Power BI, Tableau, Looker) is a plus.
Certifications:
- Project Management Professional (PMP) or Scrum Master certification (preferred).
- Data-related certifications (e.g., AWS Certified Big Data, Google Professional Data Engineer) are a plus.
Preferred Skills:
- Experience with big data technologies (e.g., Hadoop, Spark, Kafka).
- Familiarity with machine learning or AI applications related to data solutions.
- Experience with data governance and compliance (e.g., GDPR).
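The ETL processes mentioned above follow an extract-transform-load shape that is easy to sketch. A toy, in-memory Python illustration; real pipelines would read from source systems and write to a warehouse, and all names here are hypothetical:

```python
from typing import Dict, List

Record = Dict[str, str]


def extract() -> List[Record]:
    """Extract: pull raw records from a source system (stubbed here)."""
    return [
        {"id": "1", "amount": "100.5", "region": " North "},
        {"id": "2", "amount": "bad", "region": "south"},
    ]


def transform(rows: List[Record]) -> List[Dict[str, object]]:
    """Transform: validate, cast types, and normalize fields."""
    clean = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except ValueError:
            continue  # a real pipeline would route this to a dead-letter store
        clean.append({"id": row["id"], "amount": amount,
                      "region": row["region"].strip().lower()})
    return clean


def load(rows: List[Dict[str, object]], sink: List) -> None:
    """Load: append validated rows to the target store (a plain list here)."""
    sink.extend(rows)
```

The separation into three small, independently testable steps is what makes the quality-assurance and troubleshooting duties above tractable.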
What You'll Do
- Responsible for achieving quarterly and annual sales quota
- Conducts sales needs analysis with new and prospective customers, including the development of client-centric product solutions.
- Generates leads with the support of SDRs by contacting prospective clients through cold outreach, networking, and industry events.
- Qualifies new leads and determines serviceability of prospects
- Understands the communication needs of enterprise customers, and designs solutions to meet those unique business needs.
- Designs, develops, and delivers sales proposals and presentations on product benefits.
- Maintains all sales databases necessary to report sales activity and customer information with the support of SDRs
What you should have
- 3–8 years of experience in similar roles; experience as an Account Executive or Sales Development Representative at a technical product company is a big plus!
- Flawless communication skills, both written and oral, with extensive public speaking experience
- Demonstrated ability to work independently as well as be a productive team member, making outbound reach-outs every day
- Have a strong work ethic and are eager to learn and make new connections with prospects
The FullStack Developer primarily participates in designing and implementing new services as well as in customer delivery. It will be possible to influence your work profile based on your interests.
While not mandatory, a general understanding of user interfaces and the ability to put oneself in the end user's position are highly desirable. We also appreciate prior knowledge of the image-processing domain.
We are looking for well-rounded people who care about their craft and understand software development. While we value a formal degree in computer science we do not require one - a candidate with a strong background, open mind and ability to learn would be an ideal addition to our team.
That being said, here is what we think the ideal team member would be:
Responsibilities :
- Develop and integrate the Django backend and APIs with the frontend framework.
- Develop Angular applications
- Working closely with UX and Front-end Developers
- Participating in architectural, design and product discussions
- Designing and creating RESTful APIs for internal and partner consumption
- Working in an agile environment with an excellent team of engineers
Requirements :
- Good experience developing Angular applications into production
- Good experience with the Django REST framework
- Strong understanding of the Angular framework as well as JavaScript, jQuery, and HTML/CSS
- Good experience designing and structuring databases, ideally through an ORM/ODM (e.g., with MongoDB)
- Knowledge of how to build and consume RESTful APIs
- Strong Python coding skills
- Strong knowledge of version control (e.g., Git, SVN)
- Experience deploying Python applications to production
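A rough, framework-free sketch of the serialization/validation pattern that the Django REST framework formalizes for the RESTful APIs mentioned above; the endpoint and field names are invented for illustration:

```python
from typing import Dict, List, Tuple


def validate_task(payload: Dict) -> Tuple[Dict, List[str]]:
    """Validate an incoming JSON payload for a hypothetical /tasks endpoint.

    Returns (cleaned_data, errors). DRF's Serializer.is_valid() plays the
    same role, with declarative fields instead of these manual checks.
    """
    errors: List[str] = []
    title = str(payload.get("title", "")).strip()
    if not title:
        errors.append("title: this field is required")
    done = payload.get("done", False)
    if not isinstance(done, bool):
        errors.append("done: must be a boolean")
    return {"title": title, "done": bool(done)}, errors
```

Keeping validation out of the view function is what lets the same rules serve both the Angular client and partner API consumers.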
Qualification: Engineering from CSE/IT
Location: Bengaluru
Department: Engineering
Bidgely is looking for an extraordinary and dynamic Senior Data Analyst to be part of its core team in Bangalore. You must have delivered exceptionally high-quality, robust products dealing with large data. Be part of a highly energetic and innovative team that believes nothing is impossible with some creativity and hard work.
● Design and implement a high-volume data analytics pipeline in Looker for Bidgely's flagship product.
● Implement data pipeline in Bidgely Data Lake
● Collaborate with product management and engineering teams to elicit & understand their requirements & challenges and develop potential solutions
● Stay current with the latest tools, technology ideas and methodologies; share knowledge by clearly articulating results and ideas to key decision makers.
● 3-5 years of strong experience in data analytics and in developing data pipelines.
● Very good expertise in Looker
● Strong in data modeling and in developing and optimizing SQL queries.
● Good knowledge of data warehouse (Amazon Redshift, BigQuery, Snowflake, Hive).
● Good understanding of Big data applications (Hadoop, Spark, Hive, Airflow, S3, Cloudera)
● Attention to detail. Strong communication and collaboration skills.
● BS/MS in Computer Science or equivalent from premier institutes.
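The SQL and data-modeling expectations above can be illustrated with Python's built-in sqlite3 module; the actual stack is Redshift/BigQuery/Snowflake/Hive, and the table and column names below are made up:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (home_id INTEGER, appliance TEXT, kwh REAL)")
conn.executemany(
    "INSERT INTO readings VALUES (?, ?, ?)",
    [(1, "hvac", 3.2), (1, "fridge", 0.8), (2, "hvac", 4.0)],
)

# Aggregate energy use per appliance: the kind of rollup a Looker model
# generates; at warehouse scale, partitioning/sort keys on the grouped
# column are what keep a query like this fast.
rows = conn.execute(
    "SELECT appliance, ROUND(SUM(kwh), 1) AS total_kwh "
    "FROM readings GROUP BY appliance ORDER BY total_kwh DESC"
).fetchall()
```

Prototyping an aggregation locally like this is a cheap way to check query logic before translating it into a warehouse dialect.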
Your day-to-day tasks will be
• Helping our product squads to bring the best testing practices into their workflow
• Understanding the flow of code and how it interacts with different components
• Understanding the product functionality and product objectives in order to implement them
• Code contributions in the backend, and/or frontend with extensive unit tests
• Working with deployments teams and offering inputs on the testability of functional elements and designs
• Researching tools, methodologies, and trends and upgrading existing practices and processes
• Mentor junior engineers, do code reviews and help improve/maintain code quality
This might be for you if…
You have a passion for design, development, and testing, and you enjoy upgrading your technical skills as tools and technologies evolve.
• A Bachelor's degree in Computer Science, Computer Engineering, or a related field.
• 4+ years of overall professional web development experience, including coding, writing scalable code, and development, testing, and deployment best practices
• Firefighter – You love taking up fire calls on production or other critical systems
• Communicate effectively in fluent English
• Exposure to modern development environments, including Node.js, Java, JavaScript, TypeScript, version control (Git), AWS, NoSQL (MongoDB, DynamoDB), and ReactJS
• Understands the value and impact of our product for prospective customers and is driven to make it a success
• Self-motivated, hard-working, coachable, and driven with a strong entrepreneurial spirit
• Enjoys working in a collaborative atmosphere where ideas are valued on merit
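The "extensive unit tests" expectation above is concrete enough to sketch. A minimal example using Python's standard unittest module for brevity (the stack above is Node/Java/TypeScript, and the function under test is invented for illustration):

```python
import unittest


def normalize_email(raw: str) -> str:
    """Hypothetical helper under test: trim whitespace and lowercase."""
    return raw.strip().lower()


class NormalizeEmailTest(unittest.TestCase):
    def test_strips_whitespace_and_lowercases(self):
        self.assertEqual(normalize_email("  Ada@Example.COM "), "ada@example.com")

    def test_already_normalized_is_unchanged(self):
        self.assertEqual(normalize_email("ada@example.com"), "ada@example.com")
```

Run with `python -m unittest <file>`; each behavior gets its own named test case, which is what makes coverage reports and code reviews meaningful.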
Company Overview
Vamstar is a data-science-powered global B2B healthcare marketplace platform. We aggregate $2 trillion of demand for healthcare products and services using machine learning, providing real-time insights to buyers and suppliers to accelerate transactions. Our enriched data on buyers, outcomes, and contracts (including tenders) reveals interdependencies across complex contracting markets. By seeing the big picture and all the connections, Vamstar provides healthcare stakeholders with valuable market insights and perspectives. Vamstar partners with leaders in industry, academia, and government in 70 countries to apply higher-level thinking to daily tasks and strategic issues.










