


Publicis Sapient Overview:
As Senior Associate L1 in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions. You will apply a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions, and independently drive design discussions to ensure the overall health of the solution.
Job Summary:
As Senior Associate L2 in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions. You will apply a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions, and independently drive design discussions to ensure the overall health of the solution.
The role requires a hands-on technologist with a strong programming background in Java, Scala, or Python; experience in data ingestion, integration, wrangling, computation, and analytics pipelines; and exposure to Hadoop ecosystem components. Hands-on knowledge of at least one of the AWS, GCP, or Azure cloud platforms is also required.
Role & Responsibilities:
Your role is focused on the design, development, and delivery of solutions involving:
• Data Integration, Processing & Governance
• Data Storage and Computation Frameworks, Performance Optimizations
• Analytics & Visualizations
• Infrastructure & Cloud Computing
• Data Management Platforms
• Implement scalable architectural models for data processing and storage
• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time mode
• Build functionality for data analytics, search and aggregation
Experience Guidelines:
Mandatory Experience and Competencies:
1. Overall 5+ years of IT experience, with 3+ years in data-related technologies.
2. Minimum 2.5 years of experience in Big Data technologies and working exposure to related data services on at least one cloud platform (AWS / Azure / GCP).
3. Hands-on experience with the Hadoop stack (HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow) and other components required to build end-to-end data pipelines.
4. Strong experience in at least one of the programming languages Java, Scala, or Python; Java preferred.
5. Hands-on working knowledge of NoSQL and MPP data platforms such as HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL Data Warehouse, and GCP BigQuery.
6. Well-versed, working knowledge of data platform services on at least one cloud platform, including IAM and data security.
Preferred Experience and Knowledge (Good to Have):
1. Good hands-on knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres).
2. Knowledge of data governance processes (security, lineage, catalog) and tools such as Collibra and Alation.
3. Knowledge of distributed messaging frameworks such as ActiveMQ, RabbitMQ, or Solace; search and indexing; and microservices architectures.
4. Performance tuning and optimization of data pipelines.
5. CI/CD: infrastructure provisioning on cloud, automated build and deployment pipelines, and code quality.
6. Cloud data specialty and other related Big Data technology certifications.
Personal Attributes:
• Strong written and verbal communication skills
• Articulation skills
• Good team player
• Self-starter who requires minimal oversight
• Ability to prioritize and manage multiple tasks
• Process orientation and the ability to define and set up processes

Key Responsibilities
Lead Generation & Prospecting
- Identify and connect with potential clients through market research, cold calling, networking, and outreach.
- Generate a strong pipeline of leads across media sales, events, BTL campaigns, and digital marketing domains.
Client Engagement & Relationship Management
- Develop and nurture relationships with key decision-makers and stakeholders to understand their business requirements.
- Deliver compelling pitches and presentations tailored to client needs.
Sales Execution
- Achieve business targets by converting leads into long-term clients.
- Coordinate with internal teams to create proposals, quotations, and strategic sales plans.
Market Research & Analysis
- Conduct market research to understand industry trends, competitor activities, and client needs.
- Provide insights to tailor pitches and identify new business opportunities.
Relationship Management
- Develop and nurture long-term relationships with clients to ensure repeat business.
- Serve as the point of contact, delivering exceptional client support and ensuring satisfaction throughout the project lifecycle.
Campaign Planning Support
- Collaborate with internal teams to develop customized proposals and marketing strategies.
- Assist in executing BTL campaigns, digital initiatives, and events to drive client success.
Reporting & CRM Management
- Maintain accurate client and sales data in CRM tools.
- Provide regular updates on leads, opportunities, and sales performance to the leadership team.
Hiring for the below position with a leading Pharma Engineering MNC.
Description:
Experience: 2-7 years
Location: Bangalore
Good working experience in 800XA DCS
- Good working knowledge of 800xA system (Hardware/Software), PCDL, PCEL and other system libraries.
- Good understanding of ISA S88 Batch concepts.
- Should have developed Process Automation Conceptual System Architecture.
- Should have developed PCS user requirements and system hardware specifications.
- Should have developed functional specifications (Control Module, EM and Phase class) for recipes.
- Should have executed module integration tests, FAT, SAT, IQ, and OQ.

Requirements:
● Understanding our data sets and how to bring them together.
● Working with our engineering team to support custom solutions offered to product development.
● Bridging the gap between development, engineering, and data ops.
● Creating, maintaining and documenting scripts to support ongoing custom solutions.
● Excellent organizational skills, including attention to precise details
● Strong multitasking skills and ability to work in a fast-paced environment
● 5+ years of experience with Python for developing scripts.
● Know your way around RESTful APIs (able to integrate; publishing not required).
● You are familiar with pulling and pushing files from SFTP and AWS S3.
● Experience with any Cloud solutions including GCP / AWS / OCI / Azure.
● Familiarity with SQL programming to query and transform data from relational Databases.
● Familiarity to work with Linux (and Linux work environment).
● Excellent written and verbal communication skills
● Extracting, transforming, and loading data into internal databases and Hadoop
● Optimizing our new and existing data pipelines for speed and reliability
● Deploying product build and product improvements
● Documenting and managing multiple repositories of code
● Experience with SQL and NoSQL databases (Cassandra, MySQL).
● Hands-on experience in data pipelining and ETL (any of these frameworks/tools: Hadoop, BigQuery, Redshift, Athena).
● Hands-on experience with Airflow.
● Understanding of best practices and common coding patterns for storing, partitioning, warehousing, and indexing data.
● Experience reading data from Kafka topics (both live stream and offline).
● Experience with PySpark and DataFrames.
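The ETL requirements above can be sketched with a minimal, self-contained example. This is an illustrative sketch only: sqlite3 stands in for the internal database, and the file contents, table, and column names (`payments`, `user_id`, `amount`) are hypothetical.

```python
import csv
import io
import sqlite3

# Extract: read raw rows from a CSV source (an in-memory file stands in
# for a file pulled from SFTP or S3; the columns are hypothetical).
raw = io.StringIO("user_id,amount\n1,10.5\n2,not_a_number\n3,7.25\n")
rows = list(csv.DictReader(raw))

# Transform: coerce types and drop malformed records.
def clean(row):
    try:
        return int(row["user_id"]), float(row["amount"])
    except ValueError:
        return None

cleaned = [r for r in (clean(row) for row in rows) if r is not None]

# Load: write the cleaned rows into a relational table and query it back.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (user_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO payments VALUES (?, ?)", cleaned)

total = conn.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
print(total)  # 17.75 (the malformed row is dropped)
```

In a production pipeline the same extract-transform-load shape would be orchestrated by a scheduler such as Airflow, with each step as a task.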
Responsibilities:
You will be:
● Collaborating across an agile team to continuously design, iterate, and develop big data systems.
● Extracting, transforming, and loading data into internal databases.
● Optimizing our new and existing data pipelines for speed and reliability.
● Deploying new products and product improvements.
● Documenting and managing multiple repositories of code.
Skills Required / Roles and Responsibilities
- Aware of API testing.
- Good logical reasoning. Good problem-solving skills.
- Should be able to do Automation using python/java.
- Proficient with Selenium and Java
- Must have Manual testing experience.
- Hands on experience on automation frameworks and must have experience of building them from scratch.
- Great design and problem-solving skills.
- Good to have knowledge of any Cloud and basic of DevOps.
- Proficient understanding of GIT, Linux.
- Able to analyze and clearly articulate complex issues and technologies.
- Strong debugging skills.
- Excellent communication, documentation and reporting skills.
- Ability to plan, prioritize, and execute in fast-moving Agile sprints
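As a sketch of the kind of API check implied above, a test might validate the shape of a canned response before asserting on values. The endpoint schema here (`id`, `name`, `active`) is purely illustrative, not from any real API.

```python
import json

def validate_user_response(body: str) -> bool:
    """Check that an API response carries the fields a test expects.
    The field names are hypothetical; a real suite would match the
    actual API contract."""
    try:
        data = json.loads(body)
    except json.JSONDecodeError:
        return False
    return (
        isinstance(data.get("id"), int)
        and isinstance(data.get("name"), str)
        and isinstance(data.get("active"), bool)
    )

# A manual check against canned responses, as a unit test would do:
good = '{"id": 7, "name": "asha", "active": true}'
bad = '{"id": "7", "name": "asha"}'
print(validate_user_response(good), validate_user_response(bad))  # True False
```

In an automation framework the same validator would run against live responses fetched by an HTTP client, with failures reported by the test runner.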


- Develop new scripts using Perl and Python
- Analyse existing scripts and implement new changes
- Interact with the QA and deployment teams
- Interact with in-house and external customers
- Interact with internal team members for integrated development
- Communicate well within the team
Skills required:
- Experience in developing projects using Perl and Python
- Familiarity with Unix/Linux development environments and tools, including scripting and process management
- Experience with database (MySQL) concepts
- Experience with Elasticsearch
- Knowledge of Git and SVN commands
- Experience implementing OOP concepts
- Experience with XML functionality (read, create, etc.)
- Knowledge of creating CSV and XLSX files and JSON formats
- Experience with PDF functionality and FTP/SFTP modules
- Experience in unit testing
- Develop best practices to ensure coding efficiency and quality
- Experience in test-driven development and Agile methodologies
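The XML, CSV, and JSON items above can be illustrated with a short stdlib-only sketch. The `<orders>` schema and field names are hypothetical, used only to show the conversions.

```python
import csv
import io
import json
import xml.etree.ElementTree as ET

# Read a small XML document (the <order> schema is made up for this sketch).
xml_doc = "<orders><order id='1' total='9.99'/><order id='2' total='4.50'/></orders>"
orders = [
    {"id": o.get("id"), "total": float(o.get("total"))}
    for o in ET.fromstring(xml_doc).iter("order")
]

# Write the same records out as CSV (in memory here instead of a file).
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=["id", "total"])
writer.writeheader()
writer.writerows(orders)

# And serialize them as JSON.
as_json = json.dumps(orders)
print(as_json)
```

Swapping `io.StringIO` for `open(...)` writes real files; the same round-trip pattern applies to XLSX via a third-party library.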
Preference of Educational background:
- B.E
Preference of Professional background:
- Experience in handling modules
- Experience in PERL,Python,Mysql,Linux,Elastic search,Xml,PDF,FTP functionality
Job Responsibilities:
This role requires you to work on Linux systems and their associated services which provide the capability for IG to run their trading platform. Team responsibilities include daily troubleshooting and resolution of incidents, operational maintenance, and support for proactive and preventative analysis of Production and Development systems.
- Managing the Linux infrastructure and web technologies.
- Patching and upgrades of Red Hat Linux OS and server firmware.
- General Red Hat Linux system administration and networking.
- Troubleshooting and issue resolution of OS and network stack incidents.
- Configuration management using Puppet and version control.
- Systems monitoring and availability.
- Web applications and application routing.
- Website infrastructure, content delivery, and security.
Day-to-day responsibilities will include:
- Completing service requests, responding to incidents and problems as they arise, and providing day-to-day support and troubleshooting for Production and Development systems.
- Creating a run book of operational processes and following a support matrix of products.
- Ensuring internal handovers are completed and all OS documentation is updated.
- Troubleshooting system issues, planning for future capacity, and monitoring systems performance.
- Proactive monitoring of the Linux platform and ownership of these tools/dashboards.
- Working with the delivery and engineering teams to develop the platform and technologies, striving to automate where possible.
- Continuously improving the team, tools, and processes; supporting regular agile releases of applications and architectural improvements.
- Participating in a team rota to provide out-of-hours support.
Person Specification:
Ability / Expertise
This position is suited to an engineer with at least 8 years of Redhat Linux / Centos Systems Administration experience that is looking to broaden their range of technologies and work using modern tools and techniques.
We are looking for someone with the right attitude:
• Eager to learn new technologies, tools, and techniques alongside applying their existing skills and judgment.
• Pragmatic approach to balancing different work priorities such as incidents, requests, and troubleshooting.
• Can-do, proactive attitude to improving the environments around them.
• Sets the desired goal and the plans to achieve it.
• Proud of their achievements and keen to improve further.
This will be a busy role in a team so the successful candidate’s behaviors will need to strongly align with our values:
• Champion the client: customer service is a passion, cultivates trust, has clarity and communicates well, works with pace and momentum
• Lead the way: innovative and resilient, strong learning agility and curiosity
• Love what we do: Conscientiousness - has high self-discipline, carefulness, thoroughness, and organization, Flexible and adaptable
The successful candidate will be able to relate to the statements above and give examples that back them up. We believe that previous achievements signpost a good fit at IG.
Qualifications
Essential:
• At least 4 years' Systems Administration experience with Red Hat Enterprise Linux / CentOS 5/6/7 managed through a Satellite infrastructure.
• Managed an estate of 1000+ hosts and performed general system administration, networking, backup and restore, monitoring, and troubleshooting functions on that estate.
• At least 1 year of experience with scripting languages (Bash/Perl/Ruby) and automating tasks with Puppet and Red Hat Satellite. Experience with custom RPM generation.
• Strong analytical and troubleshooting skills. You will have resolved complex systems issues in your last role and have a solid understanding of the tools needed to do so.
• Excellent communication (listening, speaking, and the transmission of concepts with or without examples).
• Calm under pressure and work to tight deadlines. You will have brought critical production systems back to life.
- BE/B.Tech in Computer Science or a related degree.
- Salesforce Platform Developer II certification is highly desired
- 4+ years of experience programming on the Salesforce platform
- Deep understanding of the Salesforce.com product suite including Sales Cloud, Platform and the App Exchange
- Extensive development experience with Apex classes, triggers, Visualforce, Lightning, Batch Apex, SOQL, Salesforce APIs, and other programmatic solutions on the Salesforce platform
- Firm understanding of advanced design patterns and engineering best practices
- Familiarity with agile software delivery methodologies such as Scrum
- Background in development of enterprise systems as part of a complete software product life-cycle
- An effective communicator that works well in a collaborative team setting
- Natural problem solver who enjoys identifying ways to improve our customer's overall experience
- Prior experience in SaaS or subscription-based companies is a plus
- Familiar with integration applications such as Boomi, Informatica and Mulesoft preferred
- Working knowledge of web and JavaScript frameworks is a plus
- Self-starter, self-motivated, able to work independently, prioritize effectively, and perform multiple tasks under minimal supervision




