


Publicis Sapient Overview:
As a Senior Associate L1 in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions. You will use a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions, and independently drive design discussions to ensure the overall health of the solution.
Job Summary:
As a Senior Associate L1 in Data Engineering, you will produce technical designs and implement components for data engineering solutions. You will use a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions, and independently drive design discussions to ensure the overall health of the solution.
The role requires a hands-on technologist with a strong programming background in Java, Scala, or Python, experience in data ingestion, integration, wrangling, computation, and analytics pipelines, and exposure to Hadoop ecosystem components. Hands-on knowledge of at least one of the AWS, GCP, or Azure cloud platforms is preferable.
Role & Responsibilities:
Job Title: Senior Associate L1 – Data Engineering
Your role focuses on the design, development, and delivery of solutions involving:
• Data Ingestion, Integration and Transformation
• Data Storage and Computation Frameworks, Performance Optimizations
• Analytics & Visualizations
• Infrastructure & Cloud Computing
• Data Management Platforms
• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time
• Build functionality for data analytics, search and aggregation
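As an illustrative sketch only (not part of the role description), batch ingestion from heterogeneous sources typically normalizes records into a common schema before storage or analytics. The source formats, field names, and `normalize` mapping below are hypothetical:

```python
import csv
import io
import json

def normalize(record: dict) -> dict:
    """Map a raw record onto a common schema (hypothetical fields)."""
    return {
        "id": str(record.get("id") or record.get("user_id", "")),
        "amount": float(record.get("amount", 0.0)),
    }

def ingest_csv(text: str) -> list[dict]:
    """Batch-ingest rows from a CSV source."""
    return [normalize(row) for row in csv.DictReader(io.StringIO(text))]

def ingest_json_lines(text: str) -> list[dict]:
    """Batch-ingest rows from a JSON-lines source."""
    return [normalize(json.loads(line)) for line in text.splitlines() if line.strip()]

# Two heterogeneous sources, one unified output
csv_src = "id,amount\n1,10.5\n2,3.0"
jsonl_src = '{"user_id": 3, "amount": 7.25}'
records = ingest_csv(csv_src) + ingest_json_lines(jsonl_src)
print(records)
```

In a production pipeline the same normalize-then-merge step would run inside a framework such as Spark rather than plain Python.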
Experience Guidelines:
Mandatory Experience and Competencies:
# Competency
1. Overall 3.5+ years of IT experience, with 1.5+ years in data-related technologies
2. Minimum 1.5 years of experience in big data technologies
3. Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow, and the other components required to build end-to-end data pipelines. Working knowledge of real-time data pipelines is an added advantage.
4. Strong experience in at least one of the programming languages Java, Scala, or Python; Java preferable
5. Hands-on working knowledge of NoSQL and MPP data platforms such as HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.
Preferred Experience and Knowledge (Good to Have):
# Competency
1. Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres), with hands-on experience
2. Knowledge of data governance processes (security, lineage, catalog) and tools such as Collibra, Alation, etc.
3. Knowledge of distributed messaging frameworks such as ActiveMQ / RabbitMQ / Solace, search and indexing, and microservices architectures
4. Performance tuning and optimization of data pipelines
5. CI/CD – infrastructure provisioning on cloud, automated build and deployment pipelines, code quality
6. Working knowledge of data platform related services on at least one cloud platform, IAM, and data security
7. Cloud data specialty and other related big data technology certifications
Personal Attributes:
• Strong written and verbal communication skills
• Articulation skills
• Good team player
• Self-starter who requires minimal oversight
• Ability to prioritize and manage multiple tasks
• Process orientation and the ability to define and set up processes

Responsibilities
· Understand requirements and translate them into product features.
· Participate in Scrum ceremonies and clearly communicate the work done and the plan.
· Develop applications using front-end, middleware, and database-related technologies.
· Be hands-on in developing and implementing best practices and writing clean, efficient code.
· Follow coding standards and ensure the code is highly performant.
· Write fully automated unit test cases using a standard framework.
· Have strong exposure to REST API design principles and adhere to the RAML, Swagger, or OpenAPI specification.
· Perform impact analysis and document component designs.
· Develop reusable components using the design patterns specified by the lead/architect so that they are extensible.
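As a minimal sketch of REST-style design and automated unit checks (the `items` resource and its fields are hypothetical, not this project's actual API), a handler can be written as a pure function returning an HTTP status and body, which makes it trivially unit-testable:

```python
# Hypothetical in-memory resource; a real service would use a web framework
# and a database, but the status-code contract is the same.
ITEMS = {"42": {"name": "widget"}}

def get_item(item_id: str) -> tuple[int, dict]:
    """Return (HTTP status, body) for GET /items/{id}."""
    if item_id in ITEMS:
        return 200, ITEMS[item_id]
    return 404, {"error": "not found"}

def create_item(item_id: str, body: dict) -> tuple[int, dict]:
    """Return (HTTP status, body) for PUT /items/{id} (idempotent create/update)."""
    created = item_id not in ITEMS
    ITEMS[item_id] = body
    return (201 if created else 200), body

# Fully automated unit checks, as a test framework would assert them
assert get_item("42") == (200, {"name": "widget"})
assert get_item("7")[0] == 404
assert create_item("7", {"name": "gear"})[0] == 201
```

Keeping handlers as pure functions of their inputs is what makes the unit tests above possible without spinning up a server.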
The Role offers
· An outstanding opportunity to re-imagine, redesign, and apply technology to add value to the business and operations.
· End-to-end project exposure across multiple technical stacks and cloud platforms.
· A fit for an individual who has a passion for learning, adapts to new technologies quickly, and scales to the next level easily.
· High visibility, with the opportunity to interact with multiple groups within the organization, technology vendors, and implementation partners.
Essential Skills
· 4+ years of in-depth knowledge of Core Java, Spring DI, Spring MVC, REST, JMS, Hibernate, JDBC, PL/SQL
· 1+ years of experience in Spring Boot and Angular 8 or above
· 2+ years of experience in RESTful HTTP service design
· 2+ years of experience in JavaScript, jQuery, Bootstrap, HTML5, CSS3
· 2+ years of experience with SQL Server and PostgreSQL: writing stored procedures, performance tuning, and identifying deadlocks, transactions, and data locking/blocking scenarios
· Working knowledge of Webpack, CLI, and the Agile Scrum framework
· Good communication and unit testing knowledge
· Good to have: knowledge of one of the cloud platforms such as AWS/Azure/PCF
· Work experience in frameworks such as JPA, Spring Core, Spring AOP, and Spring Data
· Familiarity with Continuous Integration methodologies and tools, including Jenkins
· Good to have: exposure to microservices, Docker, Kubernetes, and cloud deployment
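To illustrate the transaction and rollback behavior mentioned in the skills above, here is a minimal sketch. SQLite stands in for SQL Server/PostgreSQL, and the `accounts` table and `transfer` rule are hypothetical; the point is that a failed step rolls the whole transaction back:

```python
import sqlite3

# Illustrative only: an in-memory SQLite database demonstrating
# transaction boundaries and rollback on failure.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL NOT NULL)")
conn.execute("INSERT INTO accounts VALUES (1, 100.0), (2, 50.0)")
conn.commit()

def transfer(conn, src, dst, amount):
    """Move funds atomically; roll back the whole transaction on any error."""
    try:
        with conn:  # opens a transaction; commits on success, rolls back on exception
            conn.execute("UPDATE accounts SET balance = balance - ? WHERE id = ?", (amount, src))
            cur = conn.execute("SELECT balance FROM accounts WHERE id = ?", (src,))
            if cur.fetchone()[0] < 0:
                raise ValueError("insufficient funds")
            conn.execute("UPDATE accounts SET balance = balance + ? WHERE id = ?", (amount, dst))
        return True
    except ValueError:
        return False

assert transfer(conn, 1, 2, 30.0) is True
assert transfer(conn, 1, 2, 500.0) is False  # rolled back; balances unchanged
```

In SQL Server or PostgreSQL the same pattern is expressed with explicit BEGIN/COMMIT/ROLLBACK, typically inside a stored procedure.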
Essential Qualification
· MCA or an equivalent master’s degree in computer applications is a must.


- Integrate user-facing elements developed by front-end developers with server-side logic.
- Write reusable, testable, and efficient code.
- Design and implement low-latency, high-availability, performant applications.
- Keep up with emerging technologies.
- Test and develop software for client applications.
- Proficiency in JavaScript
- Strong knowledge of JavaScript technologies such as Node.js, React.js, Express.js, etc.
- Experience with a wide variety of databases such as MySQL, MongoDB, etc.
- Understand architectural requirements and ensure effective design and development.
- Enable absorption of the latest technologies into the product line.
- Good analytical and communication skills.
AutoCAD Trainer JD
We are looking for an experienced and qualified AutoCAD trainer to join our team and deliver high-quality training courses on AutoCAD software. As an AutoCAD trainer, you will be responsible for designing, developing, and delivering engaging and informative training sessions for our clients, both online and in person. You will also be expected to provide feedback and support to learners, assess their progress, and evaluate the effectiveness of the training.
To be a successful AutoCAD trainer, you should have a solid knowledge of AutoCAD software and its applications, as well as excellent communication, presentation, and interpersonal skills. You should also have a passion for teaching and learning, and a willingness to adapt to different learning styles and needs. You should have a relevant certification or degree in AutoCAD or a related field, and at least two years of experience as an AutoCAD trainer or instructor.
Qualifications
Bachelor
Location:
📍 693, Vasundhara Sector, 14-A
Ghaziabad, Uttar Pradesh - 201010,
📍 B-132,SECTOR-2 ,NOIDA-201301
SOFTCRAYONS TECH SOLUTIONS Pvt.Ltd
DEVELOPMENT | TRAINING | CONSULTANCY


Position description:
- Architect and design systems for predictive analysis, and write algorithms to deal with financial data
- Must have experience with web services and APIs (REST, JSON, and similar) and with the creation and consumption of RESTful APIs
- Proficiency in writing algorithms with Python/pandas/NumPy; Jupyter/PyCharm
- Experience with relational and NoSQL databases (e.g., MSSQL, MongoDB, Redshift, PostgreSQL, Redis)
- Implement machine learning models using Python/R for best performance
- Work with time-series data and analyze large data sets
- Implement financial strategies in Python and generate reports to analyze the strategy results
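As a minimal sketch of the kind of time-series building block a financial strategy report relies on (the prices below are hypothetical, and real work would use pandas rather than plain lists):

```python
def moving_average(prices: list[float], window: int) -> list[float]:
    """Simple moving average; positions before a full window yield None."""
    out = []
    for i in range(len(prices)):
        if i + 1 < window:
            out.append(None)
        else:
            out.append(sum(prices[i + 1 - window:i + 1]) / window)
    return out

def daily_returns(prices: list[float]) -> list[float]:
    """Percentage change between consecutive observations."""
    return [(b - a) / a for a, b in zip(prices, prices[1:])]

prices = [100.0, 102.0, 101.0, 105.0]
print(moving_average(prices, 2))  # [None, 101.0, 101.5, 103.0]
print(daily_returns(prices))
```

A strategy report would compare such signals against returns over the same dates and aggregate the results.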
Primary Responsibilities:
- Write algorithms to deal with financial data, implement financial strategies in Python and SQL, and generate reports to analyze the strategy results.
Educational qualifications preferred – Degree: Bachelor’s degree
Required Knowledge:
- Highly skilled in SQL, Python, pandas, NumPy, machine learning, predictive modelling, algorithm design, and OOP concepts
- 2–7 years of full-time working experience in a core SQL/Python role (non-support)
- Bachelor’s degree in Engineering, or equivalent or higher education
- 6+ years of experience working with MongoDB or other NoSQL databases.
- Maintain and configure MongoDB (developer)
- Keep clear documentation of the database setup and architecture.
- Backup and Disaster Recovery management.
- Adept with all the best practices and design patterns in MongoDB for designing document schemas.
- Good grasp of MongoDB’s aggregation framework.
- Ensure that the databases achieve maximum performance and availability.
- Design indexing strategies.
- Configure, monitor, and deploy replica sets.
- Should have experience with MongoDB Atlas.
- Should have experience with development and performance tuning.
- Create roles and users and set their permissions.
- Excellent written and verbal communication skills, and critical thinking skills
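To illustrate the aggregation-framework and indexing skills listed above: an aggregation pipeline is an ordered list of stage documents, and a compound index is an ordered list of (field, direction) pairs. The collection and field names below are hypothetical, and no database connection is made; with pymongo, these shapes would be passed to `collection.aggregate(...)` and `collection.create_index(...)`:

```python
# Hypothetical aggregation pipeline: filter early with $match so later
# stages see less data, then group, sort, and limit.
pipeline = [
    {"$match": {"status": "active"}},
    {"$group": {"_id": "$region", "total": {"$sum": "$amount"}}},
    {"$sort": {"total": -1}},
    {"$limit": 5},
]

# Compound index supporting the $match: equality field first,
# then the range/sort field (ESR-style ordering).
index_spec = [("status", 1), ("amount", -1)]

print([next(iter(stage)) for stage in pipeline])  # stage order: match, group, sort, limit
```

Stage order matters: placing `$match` first lets MongoDB use the index above instead of scanning the whole collection.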
Skills Required (Knowledge and Skills)
SAP Fiori UI5
OData service
HTML5
CSS3
JavaScript
Angular, Node (not mandatory; an added advantage)



