

The Merck Data Engineering Team is responsible for designing, developing, testing, and supporting automated end-to-end data pipelines and applications on Merck’s data management and global analytics platform (Palantir Foundry, Hadoop, AWS and other components).
The Foundry platform comprises multiple technology stacks, hosted either on Amazon Web Services (AWS) infrastructure or on premises in Merck's own data centers. Developing pipelines and applications on Foundry requires:
• Proficiency in SQL / Java / Python (Python is required; proficiency in all three is not)
• Proficiency in PySpark for distributed computation (see the brief sketch after this list)
• Familiarity with Postgres and ElasticSearch
• Familiarity with HTML, CSS, and JavaScript and basic design/visual competency
• Familiarity with common databases and database connectivity (e.g. JDBC, MySQL, Microsoft SQL Server); not all are required
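As a rough illustration of the PySpark skill noted above (not part of the posting), the sketch below shows a small distributed aggregation; the file path and column names are hypothetical.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Illustrative only: a simple distributed filter-and-count in PySpark.
# The path and the "site" / "batch_id" columns are hypothetical.
spark = SparkSession.builder.appName("distributed-computation-sketch").getOrCreate()

df = spark.read.csv("s3://example-bucket/measurements.csv", header=True, inferSchema=True)

# groupBy/count is executed in parallel across the cluster's executors.
per_site = df.filter(F.col("batch_id").isNotNull()).groupBy("site").count()
per_site.show()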
This position is project-based and may involve working across multiple smaller projects or a single large project using an agile project methodology.
Roles & Responsibilities:
• Develop data pipelines by ingesting various data sources, structured and unstructured, into Palantir Foundry (see the sketch after this list)
• Participate in the end-to-end project lifecycle, from requirements analysis to go-live and operation of an application
• Act as a business analyst in developing requirements for Foundry pipelines
• Review code developed by other data engineers and check against platform-specific standards, cross-cutting concerns, coding and configuration standards and functional specification of the pipeline
• Document technical work in a professional and transparent way; create high-quality technical documentation
• Work out the best possible balance between technical feasibility and business requirements (the latter can be quite strict)
• Deploy applications on Foundry platform infrastructure with clearly defined checks
• Implement changes and bug fixes via Merck's change management framework and according to system engineering practices (additional training will be provided)
• Set up DevOps projects following Agile principles (e.g. Scrum)
• Besides working on projects, act as third-level support for critical applications; analyze and resolve complex incidents/problems. Debug problems across the full Foundry stack and code based on Python, PySpark, and Java
• Work closely with business users, data scientists/analysts to design physical data models
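As a minimal sketch of the pipeline-development work listed above (illustrative, not Merck's actual code), the snippet below ingests a structured source, normalises a few fields, and writes a cleaned dataset; all paths and column names are hypothetical, and on Foundry this logic would normally sit inside the platform's transform framework.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Illustrative ingestion step: read raw semi-structured data, deduplicate,
# normalise a timestamp column, and persist a cleaned dataset.
# Paths and column names ("customer_id", "order_ts") are hypothetical.
spark = SparkSession.builder.appName("ingest-orders-sketch").getOrCreate()

raw = spark.read.json("s3://example-bucket/raw/orders/")

cleaned = (
    raw.dropDuplicates(["customer_id", "order_ts"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .filter(F.col("customer_id").isNotNull())
)

cleaned.write.mode("overwrite").parquet("s3://example-bucket/clean/orders/")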

Similar jobs
- Job Title: Purchase Engineer
- Experience: 4 to 6 years
- Education: Degree in Mechanical / Electrical Engineering
- Location: Rakanpur, Gandhinagar
Required skills
- Experience specifically with machine shop items for machines (like VMC, CNC); foundry knowledge, if available, will be an added advantage.
- Experience in engineering, procurement, and operations.
- Effective verbal and written communication skills.
- Proficiency with personal computers and networking, specifically Microsoft Excel.
- Knowledge and direct experience in negotiating contracts and quotations with contract manufacturers and suppliers.
- Ability to identify cost reduction opportunities.
Job Description:
- Cold Calling.
- Pitching to the channel partners for the site visit.
- Maintaining the database of the channel partners regularly.
Key Skills:
- Good verbal communication skills in English and Hindi.
- Computer Literate.


Experience: 2+ years
CTC: As per company standards.
Notice Period: Immediate to 30 days
Location: Bangalore (no WFH available)
Skills: .NET with WPF/WinForms.
MYDESIGNATION, Kerala's favourite youth fashion brand, is looking for new teammates to join our sales force. Good communication skills and previous sales experience will be an advantage.
Freshers are also welcome aboard.
Salary: 212,006 to 220,000 based on experience
Location: Lulu Mall & Mall of Travancore, Trivandrum
- Taking ownership of reported customer issues and seeing problems through to resolution
- Understanding, interpreting, reproducing, and diagnosing issues reported by customers
- Researching, troubleshooting, and identifying solutions to resolve product issues
- Handling voice calls, emails, and online web sessions with customers as part of technical troubleshooting
- Exhibiting patience and empathy while working with customers, with an aim to drive a positive customer experience
- Following standard procedures for proper escalation of unresolved issues to the appropriate internal teams
- Following standard procedures for submitting tickets to the engineering team for further investigation of unresolved issues
- Contributing actively towards knowledge base articles
- Adhering to strict SLAs
- Readiness to work early morning, general, and afternoon shifts, including weekends, on a rotation basis
- Demonstrating an aptitude and appetite for learning newer technologies while expanding on core knowledge
Primary skills:
1-3; 3-8 years of relevant experience.
Strong technical knowledge of:
Cloud Technologies – AWS, Azure etc.
Databases – SQL, Oracle, MySQL etc.
Operating Systems – Linux, Windows etc.
Networking – Basic networking concepts and troubleshooting
Programming knowledge – Java, Python (desirable, not a must-have)
Prior experience with REST and SOAP calls (see the sketch after this list)
Excellent communication skills – English written and verbal
HTTP technology and principles, including REST and SOAP principles (Required)
JSON & XML Data formats (required)
Javascript Regular Expression (Good to have)
Good understanding of networking protocols and applications (TCP/IP, proxies, load balancing, firewalls, etc.) (Required)
Working knowledge of database technologies and SQL (Required)
In-depth familiarity with Linux (Required) (advanced user; sysadmin experience a bonus, but not required)
Strong analytical and logical reasoning for technical troubleshooting
Ability to collaborate with cross-functional teams – Dev, QA, Infrastructure teams etc.
Should be a team player who keeps team’s success before individual achievements
Knowledge of data integration (Informatica/MuleSoft)
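A brief sketch of the REST/JSON skills referenced in the list above (illustrative only; the endpoint, parameters, and response fields are hypothetical):

import requests

# Illustrative REST call with JSON handling; the URL and field names are made up.
resp = requests.get(
    "https://api.example.com/v1/tickets",
    params={"status": "open"},
    headers={"Accept": "application/json"},
    timeout=10,
)
resp.raise_for_status()              # raise on HTTP errors rather than continuing silently

tickets = resp.json()                # parse the JSON body into Python objects
for ticket in tickets.get("items", []):
    print(ticket.get("id"), ticket.get("summary"))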

Software Engineer – C++ (3-6 years of experience)
1. Telecom/VoLTE, LTE, 2G, 3G experience preferred
2. Programming knowledge of multi-threading, sockets, IPCs.
3. Well versed with the C++ standard library and Boost libraries.
4. Working knowledge of GNU compilers, optimization techniques on Unix/Linux based systems.
5. Proficient in debugging tools like GDB/Valgrind and profiling tools like OProfile.
6. Knowledge of Diameter (AAA) Stack
- Development experience of communication protocol stacks
- Hands-on experience in multi-threaded design techniques and implementation
- Good hands-on experience with data structures and algorithms
Requirements
- A graphic design qualification or similar
- Portfolio with design projects
- Proven work experience as a graphic designer
- Working experience with image design tools (e.g. Photoshop and Adobe Illustrator)
- A keen eye for visual details
- Aesthetic skills
- Ability to meet deadlines and collaborate with team members
- Insurance P&C and Specialty domain experience a plus
- Experience in a cloud-based architecture preferred, such as Databricks, Azure Data Lake, Azure Data Factory, etc.
- Strong understanding of ETL fundamentals and solutions; should be proficient in writing advanced/complex SQL, with expertise in performance tuning and optimization of SQL queries.
- Strong experience in Python/PySpark and Spark SQL
- Experience in troubleshooting data issues, analyzing end to end data pipelines, and working with various teams in resolving issues and solving complex problems.
- Strong experience developing Spark applications using PySpark and Spark SQL to extract, transform, and aggregate data from multiple formats, uncovering insights and actionable intelligence for internal and external use (see the sketch after this list)
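A minimal sketch of the kind of PySpark + Spark SQL work described above (illustrative only; the datasets, table names, and columns are hypothetical):

from pyspark.sql import SparkSession

# Illustrative: load two hypothetical datasets in different formats, register them
# as temporary views, and aggregate with Spark SQL.
spark = SparkSession.builder.appName("claims-aggregation-sketch").getOrCreate()

claims = spark.read.parquet("s3://example-bucket/claims/")
policies = spark.read.csv("s3://example-bucket/policies.csv", header=True, inferSchema=True)

claims.createOrReplaceTempView("claims")
policies.createOrReplaceTempView("policies")

# The same aggregation could also be written with the DataFrame API.
summary = spark.sql("""
    SELECT p.line_of_business,
           COUNT(*)           AS claim_count,
           SUM(c.paid_amount) AS total_paid
    FROM claims c
    JOIN policies p ON c.policy_id = p.policy_id
    GROUP BY p.line_of_business
""")
summary.show()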



