11+ Service delivery Jobs in Hyderabad | Service delivery Job openings in Hyderabad
👋🏼 We're Nagarro.
We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (19000+ experts across 33 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in.
REQUIREMENTS:
- Must Have: Salesforce Workflow
- Strong technical background in Java and microservices architecture (MSA), with hands-on experience in DevOps, AWS, and cloud services
- Sound knowledge of project management, Agile, and software development practices
- Ability to communicate effectively in both technical and non-technical terms
- Well versed with PM tools, especially MS Project and MS Excel
- Experience in frontend and mobile apps
- Ability to conduct code reviews and ensure quality compliance
- Experience in financial management, including handling the P&L of various accounts
- Experience designing and developing complex mappings, performance-tuning mappings, and building process flows
- Hands-on experience in system design, event sourcing, and message-broker architectures
- Requirement gathering and analysis
- Define, understand, and analyse non-functional requirements for the project
- Lead the technical team
- Should be able to compare technologies to find the best fit for project requirements
- Should be able to troubleshoot complex or unusual bugs
RESPONSIBILITIES:
- Ensuring client satisfaction above all
- Showcasing a consulting mindset by acting as a solution provider rather than an order taker
- Identifying project/service stakeholders at an early stage and working with them to ensure that the deliverables are in sync with the benefits defined in the business case.
- Planning, organizing, and monitoring the project to deliver high quality business solutions.
- Defining the scope of the project/service, managing goals, risks, issues, and resources throughout the project lifecycle.
- Mentoring and managing team members by giving constant on-the-job feedback and guidance
- Ensuring project quality of work meets defined governance, process standards and best practices.
- Reporting the status of all key metrics (e.g., risk, scope, schedule, quality, customer satisfaction) from inception through closure
- Assisting the account management team in responding to new project requests
- Identifying opportunities in the current engagement to cross-sell or upsell Nagarro's offerings.
Data Engineer
Mandatory Requirements
- Experience in AWS Glue
- Experience in Apache Parquet
- Proficient in AWS S3 and data lake
- Knowledge of Snowflake
- Understanding of file-based ingestion best practices.
- Proficiency in scripting with Python (including PySpark)
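The S3/data-lake and file-based-ingestion requirements above often come down to laying files out with a consistent, partitioned key scheme so downstream engines can prune partitions. A minimal illustrative sketch (the table and file names are hypothetical):

```python
from datetime import date

def partitioned_key(table: str, day: date, filename: str) -> str:
    """Build a Hive-style partitioned S3 key (year=/month=/day=) so
    engines such as AWS Glue, Spark, and Snowflake external tables
    can prune partitions when querying daily Parquet drops."""
    return (
        f"{table}/year={day.year:04d}/month={day.month:02d}/"
        f"day={day.day:02d}/{filename}"
    )

# Hypothetical daily Parquet drop for a 'transactions' table:
key = partitioned_key("transactions", date(2024, 3, 7), "part-0000.parquet")
print(key)  # transactions/year=2024/month=03/day=07/part-0000.parquet
```

Zero-padding the month and day keeps keys lexically sortable, which simplifies both listing and lifecycle rules.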
CORE RESPONSIBILITIES
- Create and manage cloud resources in AWS
- Ingest data from sources that expose it through different technologies such as RDBMS, REST HTTP APIs, flat files, streams, and time-series data from various proprietary systems; implement data ingestion and processing with the help of Big Data technologies
- Process and transform data using technologies such as Spark and cloud services; understand your part of the business logic and implement it in the language supported by the underlying data platform
- Develop automated data-quality checks to ensure the right data enters the platform and to verify the results of calculations
- Develop an infrastructure to collect, transform, combine and publish/distribute customer data.
- Define process improvement opportunities to optimize data collection, insights and displays.
- Ensure data and results are accessible, scalable, efficient, accurate, complete and flexible
- Identify and interpret trends and patterns from complex data sets
- Construct a framework utilizing data visualization tools and techniques to present consolidated analytical and actionable results to relevant stakeholders.
- Key participant in regular Scrum ceremonies with the agile teams
- Proficient at developing queries, writing reports and presenting findings
- Mentor junior members and bring best industry practices
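The automated data-quality check mentioned in the responsibilities can be as simple as a row-level rule gate applied before data enters the platform. A minimal, library-free sketch (the column names and rules are hypothetical):

```python
def run_quality_checks(rows, rules):
    """Apply named per-row rules; return the rows that pass every rule
    and a count of failures per rule (a common pre-load quality gate)."""
    passed, failures = [], {name: 0 for name in rules}
    for row in rows:
        ok = True
        for name, check in rules.items():
            if not check(row):
                failures[name] += 1
                ok = False
        if ok:
            passed.append(row)
    return passed, failures

# Hypothetical rules for a loan-payment feed:
rules = {
    "amount_positive": lambda r: r.get("amount", 0) > 0,
    "id_present": lambda r: r.get("loan_id") is not None,
}
rows = [
    {"loan_id": "L1", "amount": 120.0},
    {"loan_id": None, "amount": 50.0},
    {"loan_id": "L3", "amount": -5.0},
]
passed, failures = run_quality_checks(rows, rules)
print(len(passed), failures)  # 1 {'amount_positive': 1, 'id_present': 1}
```

In practice the same shape scales up as Spark column expressions or a dedicated framework; the point is that every rule failure is counted, not just the first.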
QUALIFICATIONS
- 5-7+ years' experience as a data engineer in consumer finance or an equivalent industry (consumer loans, collections, servicing, optional products, and insurance sales)
- Strong background in math, statistics, computer science, data science or related discipline
- Advanced knowledge of at least one language: Java, Scala, Python, or C#
- Production experience with: HDFS, YARN, Hive, Spark, Kafka, Oozie / Airflow, Amazon Web Services (AWS), Docker / Kubernetes, Snowflake
- Proficient with:
  - Data mining/programming tools (e.g., SAS, SQL, R, Python)
  - Database technologies (e.g., PostgreSQL, Redshift, Snowflake, and Greenplum)
  - Data visualization (e.g., Tableau, Looker, MicroStrategy)
- Comfortable learning about and deploying new technologies and tools.
- Organizational skills and the ability to handle multiple projects and priorities simultaneously and meet established deadlines.
- Good written and oral communication skills and ability to present results to non-technical audiences
- Knowledge of business intelligence and analytical tools, technologies and techniques.
Familiarity and experience with the following are a plus:
- AWS certification
- Spark Streaming
- Kafka Streaming / Kafka Connect
- ELK Stack
- Cassandra / MongoDB
- CI/CD: Jenkins, GitLab, Jira, Confluence, and other related tools
Description:
We are seeking an experienced SAP BRIM - SOM consultant to join our team. The ideal candidate will have a comprehensive understanding of the SAP BRIM - Subscription Order Management (SOM) module along with a strong background in the design, configuration, and support of SAP systems.
Responsibilities:
1. Gather business requirements and implement the SAP BRIM - SOM module based on specific business needs.
2. Develop end-to-end solutions for the SAP BRIM - SOM platform. This includes creating functional specifications, working with technical teams to develop solutions, and conducting unit and integration testing.
Position: Sr. Java Developer
Qualification: BE, B. Tech, MCA, M. Tech
Experience: 6+ Years
Skills:
Java 8+, Microservices, Spring Boot, NoSQL, Kubernetes, AWS, REST
Job Role and Responsibilities:
- Bachelor’s degree in Computer Science or equivalent
- Min 6 years of development experience in Spring Boot & Microservices architecture
- Working experience with Java, Spring Boot, Hibernate, MySQL, Spring Data JPA, AWS, Spring Cloud, Maven, GitHub, Swagger, Eureka, Zuul etc.
- Strong understanding of API Security for Microservices (Spring Security, OAuth 2.0, JWT)
- Strong understanding of web development cycle and programming techniques and tools
- Ability to work independently or within a group
- Strong understanding of modern development methodologies and best industry standards
- Experience working with APIs, RESTful APIs & Microservices
- Experience building highly scalable applications using Redis, Kafka, Akka, gRPC
- Mentor and train 3-4 engineers
- AWS knowledge is a plus
- Strong knowledge of SQL and NoSQL databases
- Should have worked on developing large scale products and services
- Knowledge of Agile processes is a must
- Able to work with multiple distributed teams
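The API-security requirement above (Spring Security, OAuth 2.0, JWT) centres on signed tokens. As a language-neutral illustration of what an HS256 JWT signature involves (a sketch only; a real service would use a vetted library such as jjwt or Nimbus on the Java side), a stdlib-only Python version:

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, per RFC 7515."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_hs256(payload: dict, secret: bytes) -> str:
    """Build a compact JWT (header.payload.signature) signed with HMAC-SHA256."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"},
                               separators=(",", ":")).encode())
    body = b64url(json.dumps(payload, separators=(",", ":")).encode())
    sig = b64url(hmac.new(secret, f"{header}.{body}".encode(),
                          hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify_hs256(token: str, secret: bytes) -> bool:
    """Recompute the signature over header.payload and compare in constant time."""
    header, body, sig = token.split(".")
    expected = b64url(hmac.new(secret, f"{header}.{body}".encode(),
                               hashlib.sha256).digest())
    return hmac.compare_digest(sig, expected)

token = sign_hs256({"sub": "user-42", "scope": "read"}, b"demo-secret")
print(verify_hs256(token, b"demo-secret"))   # True
print(verify_hs256(token, b"wrong-secret"))  # False
```

The constant-time comparison matters: a naive `==` on signatures can leak timing information to an attacker.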
Good knowledge of Python frameworks such as Django, CherryPy, etc.
Good understanding of REST API
Experience with JavaScript, jQuery, HTML and CSS
Build efficient back-end features in Python.
Integrate front-end and back-end components into the application
Develop integrations with third party applications (mostly web-based).
Working knowledge of SQL and databases.
Bachelor’s Degree in Computer Science Engineering or other related fields.
Understand the needs of the client and implement functional requirements accordingly.
Agile development methodology
Good communication (verbal and written)
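The REST-API and third-party-integration points above usually reduce to validating an incoming JSON body and shaping a response. A framework-agnostic sketch (field names are hypothetical; in Django this logic would typically live in a view or a DRF serializer):

```python
import json

# Hypothetical contract for a 'create user' endpoint.
REQUIRED_FIELDS = {"name", "email"}

def handle_create_user(raw_body: str) -> tuple:
    """Parse and validate a JSON request body, returning an
    (HTTP status, response payload) pair as a plain REST handler would."""
    try:
        data = json.loads(raw_body)
    except json.JSONDecodeError:
        return 400, {"error": "invalid JSON"}
    missing = sorted(REQUIRED_FIELDS - data.keys())
    if missing:
        return 400, {"error": f"missing fields: {', '.join(missing)}"}
    return 201, {"status": "created", "name": data["name"]}

print(handle_create_user('{"name": "Asha", "email": "asha@example.com"}'))
print(handle_create_user('{"name": "Asha"}'))
```

Keeping validation in a pure function like this makes it trivially unit-testable, independent of whichever framework (Django, CherryPy, etc.) serves the route.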

Global sports subscription streaming platform.
At least 7 years of programming experience, some of it in JS/TypeScript.
Proven expert-level knowledge in BE & FE development (Node.js, React, Electron, npm/yarn, Webpack).
Experience with cloud (preferably AWS) and/or micro-services.
Proven experience mentoring junior developers, defining work procedures and coding conventions.
Experience working with various databases (preferably more than two).
Location: Pune/Nagpur, Goa, Hyderabad
Job Requirements:
- 9+ years of total experience, preferably in the big data space.
- Experience creating Spark applications using Scala to process data.
- Experience in scheduling and troubleshooting/debugging Spark jobs (e.g., run as EMR steps).
- Experience in Spark job performance tuning and optimization.
- Experience processing data using Kafka/Python.
- Experience and understanding in configuring Kafka topics to optimize performance.
- Proficiency in writing SQL queries to process data in a data warehouse.
- Hands-on experience with Linux commands to troubleshoot/debug issues and with creating shell scripts to automate tasks.
- Experience with AWS services such as EMR.
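A common starting point for the Spark performance-tuning requirement above is sizing partitions to the input volume; targets of roughly 128-256 MB per partition are typical guidance, not a universal rule. A small sketch of that arithmetic (the input size is hypothetical):

```python
import math

def suggest_partitions(input_bytes: int, target_partition_mb: int = 128) -> int:
    """Suggest a partition count so each partition holds roughly
    target_partition_mb of data -- e.g., as an input to
    df.repartition(n) or spark.sql.shuffle.partitions.
    Always returns at least 1."""
    target = target_partition_mb * 1024 * 1024
    return max(1, math.ceil(input_bytes / target))

# Hypothetical 50 GB input with the default 128 MB target:
print(suggest_partitions(50 * 1024**3))  # 400
```

This is only a first cut; skewed keys, wide shuffles, and executor memory limits all push the real number away from the naive estimate.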



