
Position Overview
We are seeking a highly skilled Senior Salesforce Service Cloud Developer who will take complete ownership of all Service Cloud development activities—ranging from design and configuration to customization, integration, and deployment. The ideal candidate will have end-to-end control of the technical solution, ensuring high-quality, scalable, and maintainable implementations that align with business requirements and best practices.
This is a hands-on role that requires deep technical expertise, strong problem-solving abilities, and the ability to operate with minimal supervision while collaborating closely with stakeholders.
Key Responsibilities
Take full ownership of the Salesforce Service Cloud development lifecycle: requirements gathering, solution design, development, testing, deployment, and post-deployment support.
Translate business needs into technical solutions using Service Cloud features such as Cases, Knowledge, Omni-Channel, Live Agent, Service Console, Entitlements, and Milestones.
Design and develop custom Apex classes, triggers, Lightning Web Components (LWC), Visualforce pages, Flows, and Process Builder processes.
Implement integrations with external systems using REST/SOAP APIs, middleware, or other integration tools.
Configure and manage Service Cloud console, page layouts, record types, queues, and assignment rules.
Maintain code quality through version control, peer review, and adherence to Salesforce best practices.
Create and maintain technical documentation for customizations, integrations, and configurations.
Proactively identify system improvements, performance enhancements, and automation opportunities.
Serve as the go-to technical expert for Salesforce Service Cloud within the organization.
Ensure compliance with Salesforce security standards, governance, and data privacy policies.
Required Skills & Qualifications
7+ years of hands-on Salesforce development experience, including 3+ years specifically in Service Cloud.
Strong command of Apex, LWC, SOQL, SOSL, Visualforce, Flows, and Process Automation.
Proven experience in Service Cloud setup and customization, including console configuration, case management, omni-channel routing, and knowledge management.
Expertise in Salesforce APIs and integrations (REST, SOAP, platform events).
Deep understanding of Salesforce security model, sharing rules, and data architecture.
Proficiency in Salesforce DevOps tools (e.g., Copado, Gearset, or equivalent) and version control systems (Git).
Salesforce Platform Developer II and Service Cloud Consultant certifications strongly preferred.
Strong analytical, troubleshooting, and problem-solving skills.
Ability to manage full development ownership in a fast-paced environment with minimal supervision.
Preferred Skills
Experience with Einstein Bots, Field Service Lightning, or other Salesforce add-ons.
Familiarity with Agile/Scrum methodology and working with cross-functional teams.
Knowledge of testing frameworks (Apex tests, Selenium, Provar, etc.).

Similar jobs
- 5+ years of experience, with a minimum of 3 years in Automation
- Mobile testing experience with strong fundamentals and an understanding of testing and shipping releases on iOS, Android, and Web
- Prior experience creating an optimal test strategy: functional, non-functional, analytics, and automation
- Experience with relational database and SQL queries
- Experience working with JIRA
- Prior experience working with tools such as:
- Figma
- Datadog
- AWS
Data engineers:
Designing and building optimized data pipelines using cutting-edge technologies in a cloud environment to drive analytical insights. This also includes developing and maintaining scalable data pipelines and building out new API integrations to support continuing increases in data volume and complexity.
Constructing infrastructure for efficient ETL processes from various sources and storage systems.
Collaborating closely with Product Managers and Business Managers to design technical solutions aligned with business requirements.
Leading the implementation of algorithms and prototypes to transform raw data into useful information.
Architecting, designing, and maintaining database pipeline architectures, ensuring readiness for AI/ML transformations.
Creating innovative data validation methods and data analysis tools.
Ensuring compliance with data governance and security policies.
Interpreting data trends and patterns to establish operational alerts.
Developing analytical tools, utilities, and reporting mechanisms.
Conducting complex data analysis and presenting results effectively.
Preparing data for prescriptive and predictive modeling.
Continuously exploring opportunities to enhance data quality and reliability.
Applying strong programming and problem-solving skills to develop scalable solutions.
Writing unit/integration tests and contributing to documentation.
Must Have:
6 to 8 years of hands-on experience designing, building, deploying, testing, maintaining, monitoring, and owning scalable, resilient, and distributed data pipelines.
High proficiency in Scala, Java, or Python, API frameworks (e.g., Swagger), and Spark for applied large-scale data processing.
Expertise with big data technologies, including Spark, Data Lake, Delta Lake, and Hive, and with API development (e.g., Flask).
Solid understanding of batch and streaming data processing techniques.
Proficient knowledge of the Data Lifecycle Management process, including data collection, access, use, storage, transfer, and deletion.
Expert-level ability to write complex, optimized SQL queries across extensive data volumes.
Experience with RDBMS and OLAP databases such as MySQL and Redshift.
Familiarity with Agile methodologies.
Obsession with service observability, instrumentation, monitoring, and alerting.
Knowledge or experience in architectural best practices for building data pipelines.
Good to Have:
Passion for testing strategy, problem-solving, and continuous learning.
Willingness to acquire new skills and knowledge.
Possess a product/engineering mindset to drive impactful data solutions.
Experience working in distributed environments with teams scattered geographically.
*Java Developer* – (Mumbai, Churchgate – Onsite)
*Job brief*
Java developer roles and responsibilities include managing Java/Java EE application development while providing expertise in the full software development lifecycle, from concept and design to testing.
*Requirements and skills*
• 6 to 10 years of experience in application development.
• Hands on experience in designing and developing applications using Java EE platforms.
• Excellent knowledge of Relational Databases, SQL and ORM technologies (JPA2, Hibernate).
• Experience in the Spring Framework.
• Experience in microservices development; knowledge of Azure Repos.
• Good to have knowledge of containerization/Docker and Azure Pipelines/CI-CD.
• Good to have knowledge of frontend technologies, e.g., Angular, JavaScript, TypeScript.
Good understanding of OpenShift administration and monitoring.
Monitor system events to ensure health, maximum system availability, and service quality.
The candidate must have experience in large-cluster administration.
Good knowledge of OpenShift and good troubleshooting skills in Docker.
Hands-on experience with Kubernetes installation.
- Creating user flows, wireframes, prototypes and mockups
- Translating requirements into style guides, design systems, design patterns and attractive user interfaces
- Designing UI elements such as input controls, navigational components and informational components
- Creating original graphic designs (e.g. images, sketches and tables)
- Identifying and troubleshooting UX problems (e.g. responsiveness)
- Collaborating effectively with product, engineering, and management teams
- Incorporating customer feedback, usage metrics, and usability findings into design in order to enhance user experience
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Author data services using a variety of programming languages
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and Azure ‘big data’ technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across national boundaries through multiple data centres and Azure regions.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
- Work in an Agile environment with Scrum teams.
- Ensure data quality and help in achieving data governance.
Basic Qualifications
- 2+ years of experience in a Data Engineer role
- Undergraduate degree required (Graduate degree preferred) in Computer Science, Statistics, Informatics, Information Systems or another quantitative field.
- Experience using the following software/tools:
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases
- Experience with data pipeline and workflow management tools
- Experience with Azure cloud services: ADLS, ADF, ADLA, AAS
- Experience with stream-processing systems: Storm, Spark-Streaming, etc.
- Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
- Advanced working SQL knowledge, including experience with relational databases and query authoring, as well as working familiarity with a variety of databases
- Understanding of ELT and ETL patterns and when to use each. Understanding of data models and transforming data into the models
- Experience building and optimizing ‘big data’ data pipelines, architectures, and data sets
- Strong analytic skills related to working with unstructured datasets
- Build processes supporting data transformation, data structures, metadata, dependency, and workload management
- Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores
- Experience supporting and working with cross-functional teams in a dynamic environment
Mandatory Skill set:
- Experience developing across multiple solution stacks, including MEAN (MongoDB, Express, Angular, and Node.js)
- Python with any web framework
- Microservices, Performance Engineering, Docker, Kubernetes, Azure, CI/CD pipelines, MySQL, and database design
Job Role:
- Responsible for component selection, design standardization, and common library building.
- Build services and templates based on published standards (RFC type). Build coding templates and archetypes adhering to design standards.
- Understand NFRs and define the architecture, design, and validation for each NFR element.
- Design data models, service contracts, and document frameworks. Define and enforce coding standards.
- Define and ensure UI standards are followed. Design microservices, security, and deployment.
- Well-versed in module-level effort estimation and application integration patterns.
- Exposure to bug fixing, maintenance, and continuous integration releases.
- Exposure to building architectural views such as logical, physical, and deployment.
- Exposure to performance bottlenecks, RCA, and remediation.
- Exposure to security issues, RCA, and remediation.
Process Exposure:
- Able to interact with the team, stakeholders, and architects. Understand functional requirements and create design documents for features.
- Participate in sprint planning and story elaboration sessions.
- Candidate should be able to work independently in an Agile project.
- Experience with Test-Driven Development, integration testing, and Agile processes.
- Code review from a standards and design-adherence perspective.
