11+ Shared Services Jobs in India
Apply to 11+ Shared Services jobs on CutShort.io. Find your next job, effortlessly. Browse Shared Services jobs and apply today!


Job Description
Position: Software Developer, Data Engineering team
Location: Pune (initially 100% remote due to Covid-19 for the coming year)

About the Organisation:
- It provides a dynamic, fun workplace filled with passionate individuals. We are at the cutting edge of advertising technology, and there is never a dull moment at work.
- We have a truly global footprint, with our headquarters in Singapore and offices in Australia, the United States, Germany, the United Kingdom, and India.
- You will gain work experience in a global environment. We speak over 20 different languages, come from more than 16 different nationalities, and over 42% of our staff are multilingual.

We are looking for an exceptional Software Developer for our Data Engineering India team who can contribute to building a world-class big data engineering stack that will be used to fuel our Analytics and Machine Learning products. This person will contribute to the architecture, operation, and enhancement of:
- Our petabyte-scale data platform, with a key focus on finding solutions that can support the Analytics and Machine Learning product roadmap. Every day, terabytes of ingested data need to be processed and made available for querying and insight extraction for various use cases.
- Our bespoke Machine Learning pipelines. This will also provide opportunities to contribute to the prototyping, building, and deployment of Machine Learning models.
You:
- Have at least 4 years' experience.
- Have a deep technical understanding of Java or Golang.
- Production experience with Python is a big plus and an extremely valuable supporting skill for us.
- Exposure to modern Big Data tech (Cassandra/Scylla, Kafka, Ceph, the Hadoop stack, Spark, Flume, Hive, Druid, etc.), while understanding that certain problems may require completely novel solutions.
- Exposure to one or more modern ML tech stacks (Spark MLlib, TensorFlow, Keras, GCP ML Stack, AWS SageMaker) is a plus.
- Experience working in an Agile/Lean model.
- Experience supporting and troubleshooting large systems.
- Exposure to configuration management tools such as Ansible or Salt.
- Exposure to IaaS platforms such as AWS, GCP, or Azure.
- Good addition: experience working with large-scale data.
- Good addition: experience architecting, developing, and operating data warehouses, big data analytics platforms, and high-velocity data pipelines.

Note: we are not looking for a Big Data Developer / Hadoop Developer.
A business development executive (BDE) is responsible for driving a company's growth by generating revenue and increasing sales. They do this by identifying new business opportunities, building relationships with clients, and developing strategic partnerships.
Responsibilities
- Identify new business opportunities: Research market trends, analyze competitors, and identify new business opportunities
- Build relationships with clients: Identify client needs, establish long-term relationships, and ensure client satisfaction
- Develop sales campaigns: Set sales goals, evaluate sales team performance, and develop sales campaigns
- Conduct market research: Perform in-depth market research to identify business opportunities
- Develop business strategy: Monitor sales and KPIs to evaluate and optimize business strategy
- Use negotiation skills: Use negotiation skills to build relationships and close contracts with clients
- Promote company products: Promote the company's services to clients
- Collaborate with other departments: Collaborate with the sales, marketing, and development teams
Skills
- Strong communication skills, including verbal, written, and presentation skills
- Ability to listen and empathize
- Ability to build rapport and maintain relationships
- Ability to use presentation tools
- Ability to take up new challenges and overcome obstacles
Key Responsibilities:
- Leading, planning and executing the creative production of all our health & fitness solutions
- Defining and building the content production roadmap with instructors & content managers
- Driving programming strategy and regularly evaluating its performance
- Serving as the overarching lead for the entire production team, including producers, editors & talent
- Spearheading innovation & creativity among the content, instructor, celebrity, and production teams
- Managing, onboarding, and negotiating with production vendors to create content solutions
- Partnering with instructors & in-house marketing teams to build compelling brand stories
Qualification & Experience:
- 10+ years of related experience in production, experience in live studio production is a plus
- Strong leadership and communication skills
- Previous experience managing and leading a production team
- Passion for content creation & health and fitness
- Strong network of production houses, vendors & talent

LogiNext is looking for a technically savvy and passionate Junior Software Engineer - Data Science to analyze large amounts of raw information to find patterns that will help improve our company. We will rely on you to build data products to extract valuable business insights.
In this role, you should be highly analytical with a knack for analysis, math and statistics. Critical thinking and problem-solving skills are essential for interpreting data. We also want to see a passion for machine-learning and research.
Your goal will be to help our company analyze trends to make better decisions. Without knowledge of how the software works, a data scientist will struggle in this role: apart from experience developing in R and Python, you must understand modern approaches to software development and their impact. DevOps practices such as continuous integration and deployment, along with cloud computing experience, are everyday skills for managing and processing data.
Responsibilities:
- Identify valuable data sources and automate collection processes
- Undertake preprocessing of structured and unstructured data
- Analyze large amounts of information to discover trends and patterns
- Build predictive models and machine-learning algorithms
- Combine models through ensemble modeling (a minimal sketch follows this list)
- Present information using data visualization techniques
- Propose solutions and strategies to business challenges
- Collaborate with engineering and product development teams
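One of the responsibilities above, combining models through ensemble modeling, is the kind of technique a short sketch can make concrete. The following is a minimal, hypothetical example using scikit-learn's VotingClassifier; the dataset and base models are illustrative assumptions, not part of the role description.

```python
# A minimal ensemble-modeling sketch: combine three base classifiers with
# soft voting. Dataset and model choices are illustrative assumptions only.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Soft voting averages the predicted class probabilities of the base models,
# which often beats any single model when their errors are uncorrelated.
ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("rf", RandomForestClassifier(n_estimators=100, random_state=42)),
        ("nb", GaussianNB()),
    ],
    voting="soft",
)
ensemble.fit(X_train, y_train)
print("Held-out accuracy:", ensemble.score(X_test, y_test))
```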
Requirements:
- Bachelor's degree or higher in Computer Science, Information Technology, Information Systems, Statistics, Mathematics, Commerce, Engineering, Business Management, Marketing, or a related field from a top-tier school
- 0 to 1 year of experience in data mining, data modeling, and reporting
- Understanding of SaaS-based products and services
- Understanding of machine learning and operations research
- Knowledge of R, SQL, and Python; familiarity with Scala, Java, or C++ is an asset
- Knowledge of business intelligence tools (e.g. Tableau) and data frameworks (e.g. Hadoop)
- An analytical mind, business acumen, and problem-solving aptitude
- Excellent communication and presentation skills
- Proficiency in Excel for data management and manipulation
- Experience in statistical modeling techniques and data wrangling
- Ability to work independently and set goals keeping business objectives in mind
Publicis Sapient
Job Summary:
As a Senior Associate L2 in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement package solutions, and you will independently drive design discussions to ensure the necessary health of the overall solution.
The role requires a hands-on technologist with a strong programming background in Java, Scala, or Python; experience in data ingestion, integration, wrangling, computation, and analytics pipelines; and exposure to Hadoop ecosystem components. Hands-on knowledge of at least one of the AWS, GCP, or Azure cloud platforms is also required.
Role & Responsibilities:
Your role is focused on the design, development, and delivery of solutions involving:
• Data Integration, Processing & Governance
• Data Storage and Computation Frameworks, Performance Optimizations
• Analytics & Visualizations
• Infrastructure & Cloud Computing
• Data Management Platforms
• Implement scalable architectural models for data processing and storage
• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time mode (a minimal ingestion sketch follows this list)
• Build functionality for data analytics, search and aggregation
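To make the real-time ingestion responsibility above concrete, here is a minimal, hypothetical PySpark Structured Streaming sketch that subscribes to a Kafka topic and echoes records to the console. The broker address and topic name are placeholders, the console sink stands in for a real lake or warehouse sink, and running it requires the spark-sql-kafka connector on the classpath.

```python
# Minimal real-time ingestion sketch with Spark Structured Streaming.
# Broker address and topic name are placeholders, not real endpoints.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("kafka-ingestion-sketch").getOrCreate()

# Subscribe to a Kafka topic; each record arrives as a binary key/value pair.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "events-topic")               # placeholder topic
    .load()
    .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
)

# A production pipeline would write to a data lake table or warehouse;
# the console sink keeps this sketch self-contained.
query = events.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```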
Experience Guidelines:
Mandatory Experience and Competencies:
1. Overall 5+ years of IT experience, with 3+ years in data-related technologies
2. Minimum 2.5 years of experience in Big Data technologies and working exposure to related data services on at least one cloud platform (AWS / Azure / GCP)
3. Hands-on experience with the Hadoop stack (HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow) and the other components required in building end-to-end data pipelines (a minimal orchestration sketch follows this list)
4. Strong experience in at least one of the programming languages Java, Scala, or Python; Java preferable
5. Hands-on working knowledge of NoSQL and MPP data platforms such as HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, and GCP BigQuery
6. Well-versed in data-platform-related services on at least one cloud platform, with working knowledge of IAM and data security
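To ground the pipeline-building competency above, here is a minimal, hypothetical Airflow DAG (Airflow 2.4+ syntax) with an ingest-then-transform dependency; the task bodies are placeholders, not an actual client pipeline.

```python
# Minimal Airflow DAG sketch: a daily two-step pipeline in which a transform
# task runs only after an ingest task succeeds. Task bodies are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest():
    print("pull raw data from source systems")  # placeholder


def transform():
    print("clean and aggregate the ingested data")  # placeholder


with DAG(
    dag_id="daily_pipeline_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    ingest_task >> transform_task  # set the dependency
```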
Preferred Experience and Knowledge (Good to Have):
1. Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres), with hands-on experience
2. Knowledge of data governance processes (security, lineage, catalog) and tools like Collibra, Alation, etc.
3. Knowledge of distributed messaging frameworks like ActiveMQ / RabbitMQ / Solace, search & indexing, and microservices architectures
4. Performance tuning and optimization of data pipelines
5. CI/CD: infrastructure provisioning on cloud, automated build & deployment pipelines, code quality
6. Cloud data specialty and other related Big Data technology certifications
Personal Attributes:
• Strong written and verbal communication skills
• Articulation skills
• Good team player
• Self-starter who requires minimal oversight
• Ability to prioritize and manage multiple tasks
• Process orientation and the ability to define and set up processes
About BootLabs
https://www.bootlabs.in/
-We are a Boutique Tech Consulting partner, specializing in Cloud Native Solutions.
-We are obsessed with anything “CLOUD”. Our goal is to seamlessly automate the development lifecycle, and modernize infrastructure and its associated applications.
-With a product mindset, we enable start-ups and enterprises on cloud transformation, cloud migration, end-to-end automation, and managed cloud services.
-We are eager to research, discover, automate, adapt, empower and deliver quality solutions on time.
-We are passionate about customer success. With the right blend of experience and exuberant youth in our in-house team, we have significantly impacted customers.
Technical Skills:
• Expertise in any one hyperscaler (AWS/Azure/GCP), including basic services like networking, data, and workload management.
- AWS
Networking: VPC, VPC Peering, Transit Gateway, Route Tables, Security Groups, etc.
Data: RDS, DynamoDB, Elasticsearch
Workload: EC2, EKS, Lambda, etc.
- Azure
Data: Azure MySQL, Azure MSSQL, etc.
Workload: AKS, Virtual Machines, Azure Functions
- GCP
Data: Cloud Storage, DataFlow, Cloud SQL, Firestore, BigTable, BigQuery
Workload: GKE, Instances, App Engine, Batch, etc.
• Experience in any one of the CI/CD tools (GitLab/GitHub/Jenkins), including runner setup, templating, and configuration.
• Kubernetes experience (EKS/AKS/GKE) or Ansible experience, including basics like pods, deployments, networking, and service mesh, plus use of a package manager like Helm (see the sketch after this list).
• Scripting experience (Bash/Python), automation in pipelines when required, and system services.
• Infrastructure automation (Terraform/Pulumi/CloudFormation): writing modules, setting up pipelines, and versioning the code.
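As a small illustration of the Kubernetes basics listed above, here is a hypothetical sketch using the official Kubernetes Python client to list pods and deployments in a namespace. It assumes a working kubeconfig (for example, one produced by an EKS/AKS/GKE credentials command) and is not specific to any one hyperscaler.

```python
# Sketch: list pods and deployments in a namespace with the official
# Kubernetes Python client. Assumes a working kubeconfig at ~/.kube/config.
from kubernetes import client, config

config.load_kube_config()  # reads the local kubeconfig

core = client.CoreV1Api()
apps = client.AppsV1Api()

for pod in core.list_namespaced_pod(namespace="default").items:
    print("pod:", pod.metadata.name, pod.status.phase)

for deploy in apps.list_namespaced_deployment(namespace="default").items:
    ready = deploy.status.ready_replicas or 0
    print(f"deployment: {deploy.metadata.name} {ready}/{deploy.spec.replicas} ready")
```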
Optional:
• Experience in any programming language is not required but is appreciated.
• Good experience in Git, SVN, or any other code management tool is required.
• DevSecOps tools like Qualys/SonarQube/Black Duck for security scanning of artifacts, infrastructure, and code.
• Observability tools (open source: Prometheus, Elasticsearch, OpenTelemetry; paid: Datadog, 24/7, etc.)
About the Company
Peacock Engineering Ltd is a Gold-accredited IBM Premier Business Partner which has amassed over 300 person years of experience implementing business critical EAM (Enterprise Asset Management) solutions across a range of industries such as oil & gas, pharmaceuticals, utilities, facilities management, transport, and power generation.
Peacock Engineering Ltd specialise in providing consultancy services and support for the IBM Maximo EAM software product and maintain a pool of highly experienced and capable consultants fully conversant with IBM Maximo and its functionality, capabilities, and opportunities for customisation to meet business need.
Main Purpose:
Peacock Engineering’s Technical Services team is now looking for an IBM Maximo Technical Professional to support the growing demand for Maximo enterprise asset management solutions, working from our office in Bangalore.
Specific Responsibilities:
- Technical expert in IBM Maximo EAM technology.
- Should be well versed in MBO customizations for Maximo 7.x version.
- Advanced Java and SQL knowledge.
- Maximo building and deploying to various instances.
- Business process management using workflow design and management.
- Expert Knowledge of Maximo Integration Framework (MIF).
- Provide technical services over the entire lifecycle of a project.
- Strong communication skills (verbal and written) and the ability to multi-task.
- Maximo installations and upgrade work experience
- Participate in solution architecture design.
- Perform application and solution development to meet project requirements.
- Develop and document detailed technical designs to meet business requirements.
- Manage multiple technical environments and support the development and testing processes.
- Lead or assist in data conversion and migration efforts.
- Configure Maximo and assist in the development of interfaces to external systems.
- Identify areas of customization and optimization and provide solutions that meet the business requirements.
- Conduct system testing, as necessary.
Skill Requirements - Essential:
- B.Tech in Computer Science, Engineering, or a business-related field, and/or equivalent work experience.
- Strong Maximo technical knowledge required to help execute numerous projects.
- Minimum eight (8) years of work experience in a technical position involving the implementation and utilization of a fully integrated enterprise asset management system.
- Proficiency in converting functional requirements into technical specifications, and in configuring, tailoring, and/or customizing solutions, including building interfaces.
- Ability to create and update advanced technical documentation.
- Strong communication skills and the ability to work well in a project team environment.
- Drafting/Reviewing Functional Specifications
- Drafting/Reviewing Technical Specifications
Skill Requirements - Preferable:
- To bring industry knowledge, world-class capabilities, innovation, and cutting-edge technology to our clients in the Resources industry to deliver business value.
- To work with leading Resources clients, major customers, and suppliers to develop and execute projects and reliability strategies.
- To harness extensive knowledge combined with an integrated suite of methods, people, and assets to deliver sustainable, long-term solutions.
- IBM Maximo 7.x certification
Person Specification/Attributes:
- Professional and committed, with a disciplined approach to work.
- Motivated and driven by finding and providing solutions to problems.
- Polite, tactful, helpful, empathic nature, able to deliver to the needs of customers.
- Has respect for others and their views.
- Technology minded and focused, enthusiastic about technologies.
- Analytical, able to rise above the detail and see the bigger picture.
- Dedicated to continually updating and upgrading own knowledge.
- Carries a mind-set of continuous improvement, constantly looking for better and more efficient ways of doing things.
- Values quality at the centre of all things in work.
Due to the considerable amount of virtual working and interaction with colleagues and customers in different physical locations internationally, it is essential that the successful applicant has the drive and work ethic to succeed in small teams physically but in larger efforts virtually. Self-drive to communicate constantly using web collaboration and video conferencing is essential.
As an employee, you will be encouraged to continually develop your capability & attain certifications to reflect your growth as an individual.

DemandMatrix Inc. is a data company that provides Go To Market, Operations and Data Science teams with high quality company level data and intelligence. DemandMatrix uses advanced data science methodologies to process millions of unstructured data points that produce reliable and accurate technology intelligence, organizational intent and B2B spend data. Our product solves challenging problems for our clients such as Microsoft, DocuSign, Leadspace and many more.
Job Description
We use machine learning and narrow-AI to find companies and the products they are using. This is done by researching millions of publicly available sources and over a billion documents per month. We are looking for Tech Lead who loves tech challenges and is a problem solver. This will give you an opportunity to brainstorm ideas and implement solutions from scratch.
What will you do?
Will be part of the team responsible for our product roadmap.
You will be involved in rapid prototyping and quick roll-outs of ideas, in fast-paced environments working alongside some of the most talented and smartest people in the industry.
Lead a team and deliver the best output in an agile environment.
Communicate effectively, both orally and in writing with a globally distributed team.
Who Are You?
- You have designed and built multiple high-volume web services and data pipelines.
- You are genuinely excited about technology and have worked on projects from scratch.
- You are a highly motivated individual who thrives in an environment where problems are open-ended.
Must have
- 7+ years of hands-on experience in Software Development with a focus on microservices and data pipelines.
- Minimum 3 years of experience building services and pipelines using Python.
- Minimum 1 year of experience working with large volumes of data using MongoDB (a minimal sketch follows this list).
- Minimum 1 year of hands-on experience with big data pipelines and data warehouses.
- Experience designing, building, and deploying scalable and highly available systems on AWS or GCP.
- Experience with Java
- Experience with Docker / Kubernetes
- Exposure to React.js for front-end development
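To ground the MongoDB requirement above, here is a minimal, hypothetical PyMongo sketch of a server-side aggregation over a large collection; the connection URI, database, collection, and field names are all placeholders.

```python
# Sketch: aggregate a large MongoDB collection to count events per company.
# Connection URI, database, collection, and field names are placeholders.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # placeholder URI
events = client["demo_db"]["events"]               # placeholder names

# An aggregation pipeline pushes the group-by work to the server, which
# matters when the collection is too large to pull into application memory.
pipeline = [
    {"$match": {"event_type": "product_mention"}},
    {"$group": {"_id": "$company_id", "mentions": {"$sum": 1}}},
    {"$sort": {"mentions": -1}},
    {"$limit": 10},
]
for doc in events.aggregate(pipeline):
    print(doc["_id"], doc["mentions"])
```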
Additional Information
- Flexible working hours
- Fully work from home
- Birthday leave
- Remote work

