

Hi,
We are seeking a senior data leader with deep functional expertise in Salesforce Sales and Service domains to own the enterprise data model, metrics, and analytical outcomes supporting Sales, Service, and Customer Operations.
This role is business‑first and data‑centric. The successful candidate understands how Salesforce Sales Cloud and Service Cloud data is generated, evolves over time, and is consumed by business teams, and ensures analytics accurately reflect operational reality.
Snowflake serves as the enterprise analytics platform, but Salesforce domain mastery and functional data expertise are the primary requirements for success in this role.
Core Responsibilities
Salesforce Sales & Service Data Ownership
· Act as the data owner and architect for Salesforce Sales and Service domains.
- Own Sales data including leads, accounts, opportunities, pipeline, bookings, revenue, forecasting, and CPQ (if applicable).
- Own Service data including cases, case lifecycle, SLAs, backlog, escalations, and service performance metrics.
- Define and govern enterprise‑wide KPI and metric definitions across Sales and Service.
- Ensure alignment between Salesforce operational definitions and analytics/reporting outputs.
- Own cross‑functional metrics spanning Sales, Service, and the customer lifecycle (e.g., customer health, renewals, churn).
Business‑Driven Data Modeling
· Design Salesforce‑centric analytical data models that accurately reflect Sales and Service processes.
- Model sales stage progression, pipeline history, and forecast changes over time.
- Model service case lifecycle, SLA compliance, backlog aging, and resolution metrics.
- Handle Salesforce‑specific complexities such as slowly changing dimensions (ownership, territory, account hierarchies); a minimal sketch follows this list.
- Ensure data models support operational dashboards, executive reporting, and advanced analytics.
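To make the slowly changing dimension bullet concrete, here is a minimal, illustrative sketch (not this team's actual pipeline) that derives SCD Type 2 ownership intervals from daily Account snapshots. The column names and sample data are hypothetical, and a production version would more likely be implemented in Snowflake SQL.

```python
# Minimal SCD Type 2 sketch for Salesforce account ownership (illustrative only).
# Assumes daily snapshots of the Account object with hypothetical columns
# account_id, owner_id, and snapshot_date.
import pandas as pd

snapshots = pd.DataFrame({
    "account_id": ["A1", "A1", "A1", "A2", "A2"],
    "owner_id":   ["U1", "U1", "U2", "U3", "U3"],
    "snapshot_date": pd.to_datetime(
        ["2024-01-01", "2024-01-02", "2024-01-03", "2024-01-01", "2024-01-02"]
    ),
}).sort_values(["account_id", "snapshot_date"])

# A new version starts whenever the owner differs from the previous snapshot.
changed = snapshots.groupby("account_id")["owner_id"].shift() != snapshots["owner_id"]
versions = snapshots[changed].rename(columns={"snapshot_date": "valid_from"})

# Each version stays valid until the day before the next version begins.
versions["valid_to"] = (
    versions.groupby("account_id")["valid_from"].shift(-1) - pd.Timedelta(days=1)
)

print(versions)  # one row per (account_id, owner_id) ownership interval
```

The same interval logic extends to territory and account-hierarchy changes; the underlying design choice is to snapshot Salesforce state regularly so that history survives in-place record updates.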
Analytics Enablement & Business Partnership
· Partner closely with Sales Operations, Service Operations, Revenue Operations, Finance, and Analytics teams.
- Translate business questions into trusted, reusable analytical datasets.
- Identify data quality issues or Salesforce process gaps impacting reporting and drive remediation.
- Enable self‑service analytics through well‑documented, certified data products.
Technical Responsibilities (Enabling Focus)
· Architect and govern Salesforce data ingestion and modeling on Snowflake.
- Guide ELT/ETL strategies for Salesforce objects such as Opportunities, Accounts, Activities, Cases, and Entitlements (an illustrative ingestion sketch follows this list).
- Ensure reconciliation and auditability between Salesforce, Finance, and analytics layers.
- Define data access, security, and governance aligned with Salesforce usage patterns.
- Partner with data engineering teams on scalability, performance, and cost efficiency.
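To illustrate the ingestion pattern at its simplest, here is a hedged sketch using the open-source simple-salesforce and snowflake-connector-python packages. All credentials, connection details, and table names are hypothetical, and a real deployment would more likely use incremental extraction (e.g., filtering on SystemModstamp) or a managed connector rather than a full pull.

```python
# Hypothetical one-shot extract of the Opportunity object into a Snowflake
# landing table; illustrative only, not a production ELT design.
import pandas as pd
from simple_salesforce import Salesforce
from snowflake.connector import connect
from snowflake.connector.pandas_tools import write_pandas

# Pull a slice of the Opportunity object via the Salesforce REST API.
sf = Salesforce(
    username="integration@example.com",  # hypothetical credentials
    password="...",
    security_token="...",
)
records = sf.query_all(
    "SELECT Id, AccountId, StageName, Amount, CloseDate FROM Opportunity"
)["records"]
df = pd.DataFrame(records).drop(columns="attributes")  # drop REST metadata

# Land the raw extract in Snowflake; modeling happens downstream in SQL.
conn = connect(
    account="my_account",                # hypothetical connection details
    user="loader",
    password="...",
    warehouse="LOAD_WH",
    database="RAW",
    schema="SALESFORCE",
)
write_pandas(conn, df, table_name="OPPORTUNITY_RAW", auto_create_table=True)
```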
Required Experience & Skills
Salesforce Sales & Service Domain Expertise (Must‑Have)
· Extensive hands‑on experience working with Salesforce Sales Cloud and Service Cloud data.
- Strong understanding of sales pipeline management, forecasting, and revenue reporting.
- Strong understanding of service case workflows, SLAs, backlog management, and service performance measurement.
- Experience working directly with Sales Operations and Service Operations teams.
- Ability to identify when Salesforce configuration or process issues cause reporting inconsistencies.
Data & Analytics Expertise
· 10+ years working with business‑critical analytical data.
- Proven experience defining KPIs, metrics, and semantic models for Sales and Service domains.
- Strong SQL and analytical skills to validate business logic and data outcomes.
- Experience supporting BI and analytics platforms such as Tableau, Power BI, or MicroStrategy.
Platform Experience
· Experience using Snowflake as an enterprise analytics platform.
- Understanding of modern ELT/ETL and cloud data architecture concepts.
- Familiarity with data governance, lineage, and access control best practices.
Leadership & Collaboration
· Acts as a bridge between business stakeholders and technical teams.
- Comfortable challenging requirements using business and data context.
- Mentors engineers and analysts on Salesforce data nuances and business meaning.
- Strong communicator able to explain complex Salesforce data behavior to non‑technical leaders.
Thanks,
Ampera Talent Team
Please contact sairam.akirala@kiaraglobalservices.com | 7989812178
❖ Experience & Knowledge:
❖ Minimum 2 years of Sales/BD experience with departments, institutes, and industries.
❖ Previous experience in business-to-business (B2B) sales is a must.
- Candidates with market knowledge of electrical panels, automation solutions, energy-saving solutions, and IoT products will be preferred.
- People familiar with electrical/automation products from ABB/Siemens/Schneider/Mitsubishi will be preferred.
- Knowledge of LT/HT switchgear and electrical control panels is preferable.
- Basic costing knowledge.
- Basic taxation knowledge.
- Required qualification: Diploma/B.Tech in Electrical/Electronics.
❖ Skills
➢ Excellent organizational skills and a keen eye for detail
➢ Basic working knowledge of CRM/ERP systems
➢ Good command of Excel and PowerPoint
➢ Excellent communication skills in English/Oriya are a must
➢ Good commercial and negotiation skills
➢ Able to deliver presentations to customers and clients
➢ Excellent follow-up, negotiation, and problem-solving skills
About Jeeva.ai
At Jeeva.ai, we're on a mission to revolutionize the future of work by building AI employees that automate all manual tasks—starting with AI Sales Reps. Our vision is simple: "Anything that doesn’t require deep human connection can be automated & done better, faster & cheaper with AI." We’ve created a fully automated SDR using AI that generates 3x more pipeline than traditional sales teams at a fraction of the cost.
As a dynamic startup backed by Alt Capital (founded by Jack Altman & Sam Altman), Marc Benioff (CEO of Salesforce), Gokul (board, Coinbase), Bonfire (investors in ChowNow), Techstars (investors in Uber), Sapphire (investors in LinkedIn), and Microsoft, with $1M ARR just 3 months after launch, we're not just growing; we're thriving and making a significant impact in the world of artificial intelligence.
As we continue to scale, we're looking for mid-senior Full Stack Engineers who are passionate, ambitious, and eager to make an impact in the AI-driven future of work.
About You
- Experience: 3+ years of experience as a Full Stack Engineer with a strong background in React, Python, MongoDB, and AWS.
- Automated CI/CD: Experienced in implementing and managing automated CI/CD pipelines using GitHub Actions and AWS CloudFormation.
- System Architecture: Skilled in architecting scalable solutions for systems at scale, leveraging caching strategies, messaging queues, and async/await paradigms for highly performant systems (a minimal async sketch follows this list).
- Cloud-Native Expertise: Proficient in deploying cloud-native apps using AWS (Lambda, API Gateway, S3, ECS), with a focus on serverless architectures to reduce overhead and boost agility.
- Development Tooling: Proficient in a wide range of development tools such as FastAPI, React state management, REST APIs, WebSockets, and robust version control using Git.
- AI and GPTs: Competent in applying AI technologies, particularly in using GPT models for natural language processing, automation and creating intelligent systems.
- Impact-Driven: You've built and shipped products that users love and have seen the impact of your work at scale.
- Ownership: You take pride in owning projects from start to finish and are comfortable wearing multiple hats to get the job done.
- Curious Learner: You stay ahead of the curve, eager to explore and implement the latest technologies, particularly in AI.
- Collaborative Spirit: You thrive in a team environment and can work effectively with both technical and non-technical stakeholders.
- Ambitious: You have a hunger for success and are eager to contribute to a fast-growing company with big goals.
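As a small illustration of the async/await style called out in the System Architecture bullet, here is a hypothetical FastAPI endpoint (FastAPI is named in the Development Tooling bullet) fronting a toy in-process cache. The route, cache, and fetch function are invented for illustration; a real system would typically cache in Redis or similar.

```python
# Toy async endpoint: awaits a slow dependency once, then serves from cache.
import asyncio
from fastapi import FastAPI

app = FastAPI()
_cache: dict[str, dict] = {}  # in-process stand-in for a shared cache

async def fetch_profile(user_id: str) -> dict:
    # Stand-in for a non-blocking call to a database or downstream service.
    await asyncio.sleep(0.05)
    return {"user_id": user_id, "name": "Ada"}

@app.get("/users/{user_id}")
async def get_user(user_id: str) -> dict:
    # The event loop stays free while awaiting, so other requests keep being served.
    if user_id not in _cache:
        _cache[user_id] = await fetch_profile(user_id)
    return _cache[user_id]
```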
What You’ll Be Doing
- Build and Innovate: Develop and scale AI-driven products like Gigi (AI Outbound SDR) and Jim (AI Inbound SDR), and automate across voice and video with AI.
- Collaborate Across Teams: Work closely with our Product, GTM, and Engineering teams to deliver world-class AI solutions that drive massive value for our customers.
- Integrate and Optimize: Create seamless integrations with popular platforms like Salesforce, LinkedIn, and HubSpot, enhancing our AI’s capabilities.
- Problem Solving: Tackle challenging problems head-on, from data pipelines to user experience, ensuring that every solution is both functional and delightful.
- Drive AI Adoption: Be a key player in transforming how businesses operate by automating workflows, lead generation, and more with AI.
- Ensure correct and timely reporting
- Collaborate with other departments
- Create and give presentations
- Help managers in evaluating performance (e.g. writing reports, analyzing data)
- Understand each department’s (e.g. Marketing, Sales, QA, Development, PMs & Operations) daily processes and goals
- Track performance: ensure regular monitoring and report on plan vs. actual performance
- Escalate possible issues on time
- Present updates to senior management
- Assist with the management of the project life cycle from inception to final delivery sign-off
- Provide an active role and use critical judgment in the development of all project deliverables
- Assist Project Manager (PM) to ensure project requirements, standards, and documentation are followed
- Assist PM with reporting on the project status and health
We are seeking a skilled NestJS/PostgreSQL Developer to join our development team. As a NestJS/PostgreSQL Developer, you will be responsible for designing and implementing server-side applications, APIs, and databases using the NestJS framework and PostgreSQL. You will work closely with other developers, stakeholders, and project managers to deliver high-quality software solutions.
Responsibilities:
- Develop server-side applications and APIs using the NestJS framework.
- Design and implement efficient and scalable database schemas using PostgreSQL.
- Collaborate with front-end developers to integrate server-side logic with the user interface.
- Write clean and maintainable code following best practices and coding standards.
- Conduct code reviews and provide constructive feedback to improve code quality.
- Optimize application performance and troubleshoot any issues or bugs.
- Work closely with stakeholders to understand requirements and translate them into technical solutions.
- Participate in the entire software development lifecycle, including planning, designing, coding, testing, and deployment.
- Stay up-to-date with the latest trends and technologies in web development, NestJS, and PostgreSQL.
Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience (2-5 years) in server-side application development using the NestJS framework.
- Strong knowledge of TypeScript and JavaScript.
- Experience with PostgreSQL or other relational databases.
- Proficiency in building and consuming RESTful APIs.
- Familiarity with Git version control system.
- Good understanding of software development principles, design patterns, and best practices.
- Excellent problem-solving and debugging skills.
- Strong communication and collaboration skills.
- Ability to work independently as well as in a team environment.
Preferred Qualifications:
- Experience with other JavaScript frameworks such as Angular or React.
- Knowledge of microservices architecture and containerization (e.g., Docker).
- Familiarity with cloud platforms like AWS, Azure, or GCP.
- Experience with testing frameworks (e.g., Jest, Jasmine).
- Understanding of Agile/Scrum development methodologies.
Mid / Senior Big Data Engineer
Job Description:
Role: Big Data Engineer
Number of open positions: 5
Location: Pune
At Clairvoyant, we're building a thriving big data practice to help enterprises enable and accelerate the adoption of big data and cloud services. In the big data space, we lead and serve as innovators, troubleshooters, and enablers. The big data practice at Clairvoyant focuses on solving our customers' business problems by delivering products designed with best-in-class engineering practices and a commitment to keeping the total cost of ownership to a minimum.
Must Have:
- 4-10 years of experience in software development.
- At least 2 years of relevant work experience on large-scale data applications.
- Strong coding experience in Java is mandatory
- Good aptitude, strong problem-solving abilities and analytical skills, and the ability to take ownership as appropriate
- Able to code, debug, performance-tune, and deploy applications to production
- Should have good working experience on:
o Hadoop ecosystem (HDFS, Hive, YARN, file formats like Avro/Parquet)
o Kafka
o J2EE frameworks (Spring/Hibernate/REST)
o Spark Streaming or another streaming technology
- Ability to take sprint stories to completion, with unit test coverage
- Experience working in an Agile methodology
- Excellent communication and coordination skills
- Knowledge of UNIX environments and various continuous integration tools (hands-on preferred)
- Must be able to integrate quickly into the team and work independently towards team goals
- Take complete responsibility for the execution of sprint stories
- Be accountable for delivering tasks within the defined timelines and with good quality
- Follow the processes for project execution and delivery
- Follow Agile methodology
- Work closely with the team lead and contribute to the smooth delivery of the project
- Understand/define the architecture and discuss its pros and cons with the team
- Participate in brainstorming sessions and suggest improvements to the architecture/design
- Work with other team leads to get the architecture/design reviewed
- Work with the clients and counterparts (in the US) on the project
- Keep all stakeholders updated about project/task status, risks, and issues, if any
Experience: 4 to 9 years
Keywords: java, scala, spark, software development, hadoop, hive
Locations: Pune
Technology at Quolam accelerates process transformation and business growth for our clients. Our team uses leading edge technology, innovative thinking and agile processes.
This role demands that the individual be a master of the language grammar and be very familiar with structuring, designing, implementing, and testing projects based on one or more open-source technologies.
The individual should be an expert in multiple technology stacks, be hands-on, and design and write scalable applications. The individual should be able to work independently on large projects.
The individual should be able to grow and inspire the team's technical skills and keep up with technological paradigm shifts.
Job Responsibilities
10 % of Time
- Collaborate with internal teams and the Solution Architect to discuss software design and architecture and best practices.
- Learn and work on POCs of trending technologies along with the Solution Architect
- Translate application storyboards and use cases into functional applications
- Ensure the code follows the latest coding practices and industry standards
- Contribute to development activities across various projects
- Ensure the best possible performance, quality, and responsiveness of applications
- Identify, categorize, parse out, articulate, and fix problems that occur in applications
- Understand the concept of iterative development and leverage DevOps tools for CI/CD
- Form strong working relationships with your peers and across the team
- Mentor and educate less experienced team members in related technologies using methods such as informal training, pair programming, etc.
- Project Participation
o Requirement Analysis
o Task level Estimation
o Technical Design review
o Coding & Unit Testing
o Performance optimization
o Code review
o Support
o Troubleshooting/Debugging
25 % of Time
- Early adoption of trending technologies
- Proactive communication/soft skills
- Deep knowledge of OOP and RESTful API services.
- Experience using well-known JavaScript frameworks and libraries, such as React / Angular / Vue etc.
- Deep expertise in one or more of the following technologies - PHP, NodeJS, .NET
- Knowledge of DevOps (Containerization / Jenkins Pipeline, etc.)
- Ability to write high-quality, secure code and understand performance issues
- Experience with database systems, with knowledge of SQL and NoSQL stores (e.g., MySQL, Oracle / MongoDB, SQL Server, etc.)
- Experience with cloud-related technologies (AWS/Azure) is an add-on.
- Proficient in multiple stacks of technologies / systems / tools and focus on building depth and breadth of skills.
- Demonstrate the ability to build a work plan or parts of a work plan, as applicable for role
- Strong understanding of Agile methodologies.
- Development - .NET Core | PHP | HTML5/CSS | Angular | React | VueJS
- DevOps - Git, Jenkins
- Cloud - AWS | Azure
- Database - Oracle | MongoDB | MSSQL | MySQL | PostgreSQL
Soft Competencies
- Team Player
- Strong communication skills with ability to communicate complex technical concepts and align organization on decisions
- Ability to communicate well in English
- Sound problem-solving skills with the ability to quickly process complex information and present it clearly and simply
- Passionate about technology and excited about the impact of emerging/disruptive technologies
- Open to learning new ideas outside scope or knowledge skillset
- Creating a positive environment within the team
- Strong team player who can act as a technical lead for the team.
- Challenge the status quo
- Strong Python coding skills and OOP skills
- Should have worked on big data product architecture
- Should have worked with at least one SQL-based database (e.g., MySQL, PostgreSQL) and at least one NoSQL-based database (e.g., Cassandra, Elasticsearch)
- Hands-on experience with Spark frameworks: RDD, DataFrame, Dataset
- Experience developing ETL for data products
- Working knowledge of performance optimization, optimal resource utilization, parallelism, and tuning of Spark jobs (a minimal PySpark sketch follows this list)
- Working knowledge of file formats: CSV, JSON, XML, Parquet, ORC, Avro
- Good to have: working knowledge of an analytical database such as Druid, MongoDB, or Apache Hive
- Experience handling real-time data feeds (working knowledge of Apache Kafka or a similar tool is good to have)
- Python and Scala (optional), Spark/PySpark, parallel programming
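A minimal PySpark sketch of the read-aggregate-write pattern implied by the bullets above; the paths, columns, partition count, and shuffle setting are hypothetical and would be tuned to the actual workload:

```python
# Illustrative PySpark job: CSV in, partitioned Parquet out.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("events-etl")
    .config("spark.sql.shuffle.partitions", "200")  # tune to data volume/cores
    .getOrCreate()
)

# Read raw CSV, derive a date column, and aggregate per day and event type.
events = spark.read.option("header", "true").csv("s3://bucket/raw/events/")
daily = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "event_type")
    .agg(F.count("*").alias("event_count"))
)

# Write columnar Parquet partitioned by date; coalesce to avoid small files.
daily.coalesce(8).write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://bucket/curated/daily_events/"
)
```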








