11+ SAP RFC Jobs in Chennai | SAP RFC Job openings in Chennai

Job Description
The SAP HR Technical Lead Support Consultant will analyze and propose technical solutions for
production issues raised as incidents and work toward their closure, collaborating closely with the
technical team and functional consultants. Responsibilities include effort estimation for any break
fixes needed, feasibility studies, unit testing, technical reviews, and mentoring/guiding the technical team.
Essential Functions:
• Must be able to work on rotational shifts.
• Must have experience with 2-3 SAP Payroll implementations.
• SAP HR ABAP development or support experience. Must have experience and expert knowledge
of ABAP development objects including Dialog programming, User-exits, BADIs, Smart Forms
and/or Adobe Forms, ALV, BAPIs, RFCs, and other SAP development tools.
• Must have worked on payroll implementation projects.
• Must have good experience in SAP HR ABAP and SAP UI5/Fiori applications using SAP Web IDE.
• Must have hands-on experience and technical proficiency in Object-Oriented ABAP (OOP) with
expertise in WRICEF development.
• Must have strong ABAP debugging skills.
• Manage the support team of Functional associates, Technical associates and Scrum masters
focused on delivering and maintaining products for the Human resources domain.
• Work with solution architects and help define business processes and system architecture
• Support the team in ensuring that incidents and production bugs are resolved within the given
SLA
• Ensure that RCA is done and the systems (HRO/payroll) are stabilized
• Ensure compliance with security practices/guidelines and relevant technology standards
• Experience working on HR interfaces.
• Support experience in SAP and the cross-functional relationships between modules for forms,
reports, interfaces, enhancements, BDC, LSMW, BADI, BAPI, workflow, and other development
work
• Strong knowledge and understanding of SAP Transport layers, IDOCs and interfaces and the
different kinds of Transport Request and SAP best practices in handling transports
• Understanding of all the key enhancements spots (BADIs and User Exits) and how to best
leverage them to meet customer-specific requirements with minimal risk
• Exemplary troubleshooting skills and ability to drive root cause analysis on incidents and
problems
• Must have good experience in HR ABAP programming, with extensive work on PA, OM, Time &
Payroll.
• Must have good experience in the SAP ESS/MSS modules and all submodules of HR, with
specialization in Payroll, including work on PCRs.
• Must have good knowledge of configuring and customizing various HCM modules, i.e., Personnel
Administration, Organizational Management, Time Management, Payroll, and ESS/MSS
Required Education and Experience:
• Bachelor’s Degree or equivalent experience
• 8-12 Years of SAP ABAP/HR ABAP experience
• Strong application support experience for a large client
Preferred Education and Experience:
• SAP payroll technical knowledge
• Excellent written and verbal communication skills
• Good logical and analytical skills
• Ability to work independently and guide the technical team
• Experience with the ticket-handling tool ServiceNow.
Additional Eligibility Qualifications: An ideal candidate will have a minimum of 2-3 SAP HCM
Payroll support projects and 8-12 years of experience as an SAP HCM ABAP developer.
Send CV to vinay.sharmaatcodersbrain.com
Bangalore / Chennai
- Hands-on data modelling for OLTP and OLAP systems
- In-depth knowledge of Conceptual, Logical and Physical data modelling
- Strong understanding of Indexing, partitioning, data sharding, with practical experience of having done the same
- Strong understanding of variables impacting database performance for near-real-time reporting and application interaction.
- Should have working experience with at least one data modelling tool, preferably DbSchema or Erwin
- Good understanding of GCP databases like AlloyDB, CloudSQL, and BigQuery.
- People with functional knowledge of the mutual fund industry will be a plus
Role & Responsibilities:
● Work with business users and other stakeholders to understand business processes.
● Ability to design and implement Dimensional and Fact tables
● Identify and implement data transformation/cleansing requirements
● Develop a highly scalable, reliable, and high-performance data processing pipeline to extract, transform and load data from various systems to the Enterprise Data Warehouse
● Develop conceptual, logical, and physical data models with associated metadata including data lineage and technical data definitions
● Design, develop and maintain ETL workflows and mappings using the appropriate data load technique
● Provide research, high-level design, and estimates for data transformation and data integration from source applications to end-user BI solutions.
● Provide production support of ETL processes to ensure timely completion and availability of data in the data warehouse for reporting use.
● Analyze and resolve problems and provide technical assistance as necessary. Partner with the BI team to evaluate, design, develop BI reports and dashboards according to functional specifications while maintaining data integrity and data quality.
● Work collaboratively with key stakeholders to translate business information needs into well-defined data requirements to implement the BI solutions.
● Leverage transactional information, data from ERP, CRM, HRIS applications to model, extract and transform into reporting & analytics.
● Define and document the use of BI through user experience/use cases, prototypes, test, and deploy BI solutions.
● Develop and support data governance processes, analyze data to identify and articulate trends, patterns, outliers, quality issues, and continuously validate reports, dashboards and suggest improvements.
● Train business end-users, IT analysts, and developers.
Required Skills:
● Bachelor’s degree in Computer Science or similar field or equivalent work experience.
● 5+ years of experience on Data Warehousing, Data Engineering or Data Integration projects.
● Expert with data warehousing concepts, strategies, and tools.
● Strong SQL background.
● Strong knowledge of relational databases like SQL Server, PostgreSQL, MySQL.
● Strong experience in GCP & Google BigQuery, Cloud SQL, Composer (Airflow), Dataflow, Dataproc, Cloud Function and GCS
● Good to have knowledge on SQL Server Reporting Services (SSRS), and SQL Server Integration Services (SSIS).
● Knowledge of AWS and Azure Cloud is a plus.
● Experience in Informatica Power exchange for Mainframe, Salesforce, and other new-age data sources.
● Experience in integration using APIs, XML, JSONs etc.
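The dimensional and fact-table design mentioned in the responsibilities above can be sketched minimally as a star schema. All table and column names below are hypothetical, and SQLite stands in for a real warehouse such as BigQuery:

```python
import sqlite3

# Hypothetical star-schema sketch: one fact table keyed to a dimension table.
# Names are illustrative, not from any real warehouse.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("""
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,  -- surrogate key
    customer_id  TEXT,                 -- natural/business key
    region       TEXT
)""")

cur.execute("""
CREATE TABLE fact_sales (
    sale_id      INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    amount       REAL
)""")

cur.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                [(1, "C001", "South"), (2, "C002", "North")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(10, 1, 250.0), (11, 1, 100.0), (12, 2, 75.0)])

# Typical analytical query: aggregate facts grouped by a dimension attribute.
cur.execute("""
SELECT d.region, SUM(f.amount)
FROM fact_sales f
JOIN dim_customer d ON d.customer_key = f.customer_key
GROUP BY d.region
ORDER BY d.region
""")
result = dict(cur.fetchall())
print(result)  # → {'North': 75.0, 'South': 350.0}
```

The surrogate key on the dimension is what lets slowly changing attributes be versioned later without rewriting the fact table.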
Job Summary:
We are seeking a seasoned Java Backend Developer Lead to drive the design, development, and deployment of robust backend systems. This role requires hands-on coding expertise, architectural vision, and the ability to mentor and lead a team of developers in delivering high-performance, scalable applications.
Key Responsibilities:
- Lead the backend development team in designing and implementing microservices-based architectures using Java (preferably Java 11+).
- Architect and develop RESTful APIs and backend services using Spring Boot, Hibernate, and related frameworks.
- Collaborate with DevOps teams to streamline CI/CD pipelines using Jenkins, Docker, and Kubernetes.
- Ensure code quality through code reviews, unit testing (JUnit/TestNG), and integration testing.
- Optimize application performance and scalability through profiling and tuning.
- Guide team members in adopting best practices in software engineering, including Agile methodologies.
- Work closely with product managers, frontend developers, and QA teams to deliver end-to-end solutions.
- Maintain documentation and ensure knowledge sharing across the team.
Required Skills:
- Strong proficiency in Java, Spring Boot, and REST API development.
- Experience with relational databases (MySQL, PostgreSQL) and NoSQL (MongoDB, Redis).
- Familiarity with messaging systems like Kafka or RabbitMQ.
- Solid understanding of cloud platforms (AWS, Azure, or GCP).
- Experience with containerization and orchestration tools (Docker, Kubernetes).
- Excellent problem-solving, debugging, and analytical skills.
- Strong leadership and communication abilities.
Preferred Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or related field.
- 7+ years of backend development experience, with at least 2 years in a lead role.
- Exposure to security best practices and performance testing tools like JMeter.


Proficient in Golang, Python, Java, C++, or Ruby (at least one)
Strong grasp of system design, data structures, and algorithms
Experience with RESTful APIs, relational and NoSQL databases
Proven ability to mentor developers and drive quality delivery
Track record of building high-performance, scalable systems
Excellent communication and problem-solving skills
Experience in consulting or contractor roles is a plus
Mandatory
Exposure to the trading and investment banking domain
All technical competencies are mandatory, and no relaxations/exceptions can be given.
Technical skills
Cloud computing (IaaS, PaaS, SaaS) -preferably in AWS
Compute/Container/Orchestration technologies (Docker, Kubernetes, ECS, EKS, Lambda/Serverless etc.)
Microservices & multi-tier architectures
DevOps/CI/CD (GIT/Bitbucket, Maven, Gradle, Jenkins, Sonar) GitLab is preferable
Java (Core & EE, Spring Boot, Spring MVC, Spring Cloud) & Python
RDBMS systems such as MySQL and Postgres; NoSQL and storage technologies (S3, EBS, etc.)
API – GraphQL, REST, API Gateway
Integration and events/messaging technologies (Kafka, RabbitMQ, SNS, SQS)
Caching Solutions such as Elasticache/Redis, Hazelcast, EHCache
Observability and monitoring (Dynatrace, Cloud Watch, Grafana, Splunk, Datadog)
Very good understanding of Agile software releases, with hands-on experience
Experience in project management tools like JIRA
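The caching solutions listed above (Elasticache/Redis, Hazelcast, EHCache) all rely on the same read-through, time-to-live pattern, which can be sketched in-process. This is only an illustration of the pattern, not a substitute for a distributed cache; the key and loader below are hypothetical:

```python
import time

# Minimal in-process TTL cache illustrating the read-through caching pattern.
class TTLCache:
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expiry timestamp)

    def get(self, key, loader):
        """Return the cached value, or call loader() and cache the result."""
        entry = self._store.get(key)
        now = time.monotonic()
        if entry is not None and entry[1] > now:
            return entry[0]          # fresh: serve from cache
        value = loader()             # miss or expired: reload
        self._store[key] = (value, now + self.ttl)
        return value

# Usage: the second lookup is served from the cache, so the loader runs once.
loads = {"n": 0}
def load_price():
    loads["n"] += 1
    return 101.5

cache = TTLCache(ttl_seconds=60)
a = cache.get("AAPL", load_price)
b = cache.get("AAPL", load_price)
print(a, b, loads["n"])  # → 101.5 101.5 1
```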
Educational Qualification
Bachelor’s degree in computer science engineering/ECE/EEE, IT or MCA, MSc Computer Science
Research industry-related topics (combining online sources and studies)
Write clear marketing copy to promote our products/services
Prepare well-structured drafts using Content Management Systems
Proofread and edit blog posts before publication
Coordinate with marketing and design teams to illustrate articles
Conduct simple keyword research and use SEO guidelines to increase web traffic
Promote content on social media
Identify customers’ needs and gaps in our content and recommend new topics
Ensure all-around consistency (style, fonts, images, and tone)
Should be able to write content in Tamil and English
Immediate joiners



Looking for a Web Developer with good hands-on experience in Laravel.
- Candidates should have good team interaction skills.
Experience : 1 to 2 years
Responsibilities :
- Strong coding knowledge in Laravel Framework.
- Responsible for handling projects independently as well as collaboratively.
- Should be capable of handling multiple PHP projects.
- Analyse scope of work and time frame accordingly.
- Should have good analytical and problem solving skills.
- Ability to motivate teammates.
- Mentor and lead a team of junior developers.
- Provide technical guidance to development team members.
Key Skills:
PHP, Laravel, MySQL, JavaScript, jQuery, HTML, CSS, DB
Added advantage to have : Code igniter, Angular
Job Types: Full-time; walk-in preferred; immediate joiners.
We at Condé Nast are looking for a Support Engineer (Level 2) who will be responsible for
monitoring and maintaining the production systems to ensure business continuity. Your
responsibilities also include prompt communication to business and internal teams about
process delays, stability, issues, and resolutions.
Primary Responsibilities
● 5+ years of experience in production support
● The Support Data Engineer is responsible for monitoring of the data pipelines
that are in production.
● Level 3 support activities: analysing issues, debugging programs & jobs, bug fixes
● The position will contribute to the monitoring, rerun or reschedule, code fix
of pipelines for a variety of projects on a daily basis.
● Escalate failures to the Data Team/DevOps in case of infrastructure failures or when unable
to revive the data pipelines.
● Ensure accurate alerts are raised in case of pipeline failures and the corresponding
stakeholders (Business/Data Teams) are notified about the same within the
agreed-upon SLAs.
● Prepare and present success/failure metrics by accurately logging the
monitoring stats.
● Able to work in shifts to provide overlap with US Business teams
● Other duties as requested or assigned.
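The monitor, rerun, and escalate loop described in the responsibilities above can be sketched as a bounded-retry wrapper. Everything here is hypothetical: `run_pipeline` and `notify` stand in for the real scheduler call and alerting channel, and the retry budget is illustrative:

```python
from dataclasses import dataclass

@dataclass
class RunResult:
    name: str
    attempts: int
    succeeded: bool

def monitor(name, run_pipeline, notify, max_retries=2):
    """Run a pipeline; on failure, rerun up to max_retries times, then escalate."""
    for attempt in range(1, max_retries + 2):  # initial run + retries
        if run_pipeline(name):
            return RunResult(name, attempt, True)
    # All attempts failed: raise an alert so stakeholders are notified.
    notify(f"ESCALATE: {name} failed after {max_retries} retries")
    return RunResult(name, max_retries + 1, False)

# Usage with a flaky stub that succeeds on its second attempt.
calls = {"n": 0}
def flaky(_name):
    calls["n"] += 1
    return calls["n"] >= 2

alerts = []
result = monitor("daily_ingest", flaky, alerts.append)
print(result.succeeded, result.attempts)  # → True 2
```

A real implementation would also log each attempt for the success/failure metrics the role is asked to prepare.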
Desired Skills & Qualification
● Have strong working knowledge of PySpark, Informatica, SQL (Presto), batch
handling through schedulers (Databricks and Astronomer experience will be an
advantage), AWS S3, Airflow, and Hive/Presto
● Have basic knowledge of shell scripts and/or Bash commands.
● Able to execute queries in Databases and produce outputs.
● Able to understand and execute the steps provided by Data-Team to
revive data-pipelines.
● Strong verbal and written communication skills, and strong interpersonal
skills.
● Graduate/Diploma in computer science or information technology.
About Condé Nast
CONDÉ NAST GLOBAL
Condé Nast is a global media house with over a century of distinguished publishing
history. With a portfolio of iconic brands like Vogue, GQ, Vanity Fair, The New Yorker and
Bon Appétit, we at Condé Nast aim to tell powerful, compelling stories of communities,
culture and the contemporary world. Our operations are headquartered in New York and
London, with colleagues and collaborators in 32 markets across the world, including
France, Germany, India, China, Japan, Spain, Italy, Russia, Mexico, and Latin America.
Condé Nast has been raising the industry standards and setting records for excellence in
the publishing space. Today, our brands reach over 1 billion people in print, online, video,
and social media.
CONDÉ NAST INDIA (DATA)
Over the years, Condé Nast successfully expanded and diversified into digital, TV, and
social platforms - in other words, a staggering amount of user data. Condé Nast made the
right move to invest heavily in understanding this data and formed a whole new Data
team entirely dedicated to data processing, engineering, analytics, and visualization. This
team helps drive engagement, fuel process innovation, further content enrichment, and
increase market revenue. The Data team aimed to create a company culture where data
was the common language and facilitate an environment where insights shared in
real-time could improve performance. The Global Data team operates out of Los Angeles,
New York, Chennai, and London. The team at Condé Nast Chennai works extensively with
data to amplify its brands' digital capabilities and boost online revenue. We are broadly
divided into four groups, Data Intelligence, Data Engineering, Data Science, and
Operations (including Product and Marketing Ops, Client Services) along with Data
Strategy and monetization. The teams built capabilities and products to create
data-driven solutions for better audience engagement.
What we look forward to:
We want to welcome bright, new minds into our midst and work together to create
diverse forms of self-expression. At Condé Nast, we encourage the imaginative and
celebrate the extraordinary. We are a media company for the future, with a remarkable
past. We are Condé Nast, and It Starts Here.
● Solid design and architecture skills; experience in the design, development, and deployment of large-scale, multi-tier enterprise applications.
● Ensure agile, test-driven development for robustness, usability, reliability, security, and performance.
● Expert in core Java, Spring Boot, and other Spring libraries, Eureka, Hystrix, etc.
● Experience creating architecture or developing web services to integrate applications with databases such as Oracle, MySQL, MongoDB, or Cassandra.
● Solid understanding of OOP, algorithms, and data structures.
● Experience with Kafka / Scala / Storm / Elasticsearch and web services like RESTful / SOAP.
● Extensive experience with version control systems (Git preferred) and issue-tracking systems (JIRA preferred).



Senior Engineer – Artificial Intelligence / Computer Vision
(Business Unit – Autonomous Vehicles & Automotive - AVA)
We are seeking an exceptional, experienced senior engineer with deep expertise in Computer Vision, Neural Networks, 3D Scene Understanding and Sensor Data Processing. The expectation is to lead a growing team of engineers to help them build and deliver customized solutions for our clients. A solid engineering as well as team management background is a must.
About MulticoreWare Inc
MulticoreWare Inc is a software and solutions development company with top-notch talent and skill in a variety of micro-architectures, including multi-thread, multi-core, and heterogeneous hardware platforms. It works in rapidly expanding sectors including High Performance Computing (HPC), Media & AI Analytics, Video Solutions, and Autonomous Vehicle & Automotive software. The Autonomous Vehicles & Automotive business unit specializes in delivering optimized solutions for sophisticated sensor-fusion intelligence, designing algorithms, and implementing software to be deployed on a variety of automotive-grade hardware platforms.
Role Responsibilities
● Lead a team to solve the problems in a perception / autonomous-systems scope and turn ideas into code & products
● Drive all technical elements of development, such as project requirements definition, design, implementation, unit testing, integration, and software delivery
● Implement cutting-edge AI solutions on embedded platforms and optimize them for performance; hardware-architecture-aware algorithm design and development
● Contribute to the vision and long-term strategy of the business unit
Required Qualifications (Must Have)
● 3 - 7 years of experience with real world system building, including design, coding (C++/Python) and evaluation/testing (C++/Python)
● Solid experience in 2D / 3D Computer Vision algorithms, Machine Learning and Deep Learning fundamentals – Theory & Practice. Hands-on experience with Deep Learning frameworks like Caffe, TensorFlow or PyTorch
● Expert-level knowledge in any area related to signal/data processing or autonomous/robotics software development (perception, localization, prediction, planning), multi-object tracking, and sensor-fusion algorithms, with familiarity with Kalman filters, particle filters, clustering methods, etc.
● Good project management and execution capabilities, as well as good communication and coordination ability
● Bachelor’s degree in Computer Science, Computer Engineering, Electrical Engineering, or related fields
Preferred Qualifications (Nice-to-Have)
● GPU architecture and CUDA programming experience, as well as knowledge of AI inference optimization using Quantization, Compression (or) Model Pruning
● Track record of research excellence with prior publication on top-tier conferences and journals
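The Kalman filters named in the qualifications above can be illustrated in one dimension. This is a minimal sketch of a constant-state scalar filter; the noise variances and measurement sequence are illustrative, not tuned for any real sensor:

```python
def kalman_1d(measurements, q=1e-4, r=0.1, x0=0.0, p0=1.0):
    """Estimate a scalar state from noisy measurements.

    q: process-noise variance, r: measurement-noise variance,
    x0/p0: initial state estimate and its variance.
    """
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # Predict: state assumed unchanged; uncertainty grows by process noise.
        p = p + q
        # Update: blend prediction and measurement by the Kalman gain.
        k = p / (p + r)
        x = x + k * (z - x)
        p = (1 - k) * p
        estimates.append(x)
    return estimates

# Noisy readings of a quantity whose true value is about 1.0.
est = kalman_1d([1.1, 0.9, 1.05, 0.95, 1.0])
print(est[-1])  # converges toward 1.0
```

The same predict/update structure generalizes to the multi-dimensional, matrix form used in real tracking and sensor-fusion stacks.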
Must Have Requirements
1. Fluency in English and Tamil.
2. Sales experience selling to industries and small and medium businesses in industrial areas dealing in manufacturing, garments, pharma, etc.
3. Experience working with channel partners and agents.
Nice-to-have requirements: B2B sales in financial services, insurance, software, etc.