
Tata Consultancy Services
https://tcs.com
Jobs at Tata Consultancy Services
RPA Power Automate Developer
Overview
Seeking an experienced RPA Developer specializing in Microsoft Power Automate to design, build, and optimize automation workflows that improve operational efficiency across business processes.
Key Responsibilities
Develop, test, and deploy automation solutions using Power Automate (Cloud & Desktop).
Analyze business processes, identify automation opportunities, and create scalable workflows.
Integrate Power Automate with SharePoint, Power Apps, Dataverse, Outlook, Teams, and external systems via connectors/APIs.
Maintain, monitor, and troubleshoot existing automations to ensure high reliability.
Collaborate with business stakeholders to gather requirements and document technical solutions.
Ensure governance, security, and compliance standards in automation development.
Required Skills & Experience
5–9 years of experience in RPA with strong expertise in Power Automate.
Hands-on experience with Power Automate Desktop, RPA flows, cloud flows, and custom connectors.
Solid understanding of REST APIs, JSON, OData, and workflow logic.
Experience with Power Platform components (Power Apps, Power BI, Dataverse) is a plus.
Strong analytical, problem‑solving, and documentation skills.
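The REST/JSON/OData skills listed above can be sketched briefly. This is a minimal illustration only: the SharePoint-style endpoint, list name, and field names are hypothetical, and in Power Automate this logic would normally live in an HTTP action or custom connector rather than hand-written code.

```python
import json
from urllib.parse import urlencode

# Hypothetical SharePoint-style OData endpoint; not a real tenant.
BASE = "https://example.sharepoint.com/_api/web/lists/getbytitle('Invoices')/items"

def build_odata_url(base, select, filter_expr, top=50):
    """Compose an OData query URL, as a custom connector might issue."""
    params = {"$select": ",".join(select), "$filter": filter_expr, "$top": top}
    return base + "?" + urlencode(params)

url = build_odata_url(BASE, ["Title", "Amount"], "Status eq 'Pending'")

# A sample JSON body in the shape OData services commonly return.
sample_response = json.loads('{"value": [{"Title": "INV-001", "Amount": 1200}]}')
pending = [item["Title"] for item in sample_response["value"]]
print(pending)  # ['INV-001']
```

The `$filter`/`$select` system query options shown here are the same ones surfaced by SharePoint and Dataverse connectors.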
Preferred Qualifications
Microsoft Power Platform certification (e.g., PL‑500, PL‑400).
Experience with SQL, SharePoint, Azure services, or related automation tools.
• Knowledge of routing and switching technologies such as IGP (OSPF), EGP (BGP), STP, RSTP, VTP, VSS, vPC, vDC, MSTP, LACP, and VLAN
• Knowledge of the data center, cloud, and virtualization industry, including ACI and Nexus products
• Experienced in LAN switching technologies such as STP, RSTP, VTP, VSS, vPC, vDC, MSTP, LACP, VLAN, VXLAN-EVPN, DCNM, OTV, FEX, FabricPath, and LISP
• Knowledge of Cisco switching platforms such as Catalyst 6500, 6800, 4500, and 3850, and Nexus 7K, 5K, 3K, and 9K (standalone and ACI)
• Experience in installation, configuration, testing, troubleshooting, and/or network solution design for Cisco routers and switches
The recruiter has not been active on this job recently. You may apply but please expect a delayed response.
Key Responsibilities:
- Design, develop, and implement websites and components within Adobe Experience Manager (AEM).
- Develop and customize AEM components, templates, dialogs, workflows, servlets, and OSGi services.
- Integrate AEM with external systems and third-party services (CRM, DAM, Analytics, etc.).
- Implement responsive front-end solutions using HTML5, CSS3, JavaScript, ReactJS/Angular, and Sightly/HTL.
- Work closely with AEM Architects to ensure technical alignment and best practices.
- Optimize performance of AEM pages and ensure adherence to SEO and accessibility standards.
- Manage AEM environments, deployments, and configurations.
- Provide technical guidance and mentorship to junior developers.
- Collaborate with QA and DevOps teams for CI/CD automation and deployment pipelines.
- Troubleshoot and resolve AEM production issues.
Job Title: PySpark/Scala Developer
Functional Skills: Experience in Credit Risk/Regulatory risk domain
Technical Skills: Spark, PySpark, Python, Hive, Scala, MapReduce, Unix shell scripting
Good to Have Skills: Exposure to Machine Learning Techniques
Job Description:
5+ years of experience developing, fine-tuning, and implementing programs/applications using Python/PySpark/Scala on a Big Data/Hadoop platform.
Roles and Responsibilities:
a) Work with a leading bank's risk management team on specific projects/requirements pertaining to risk models in consumer and wholesale banking
b) Enhance machine learning models using PySpark or Scala
c) Work with data scientists to build ML models based on business requirements, and follow the ML lifecycle to deploy them all the way to the production environment
d) Participate in feature engineering, model training, scoring, and retraining
e) Architect data pipelines and automate data ingestion and model jobs
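Item d) above (feature engineering and scoring) might look like the following stdlib-only sketch; in practice the same logic would be expressed as PySpark DataFrame transformations (`withColumn` expressions) rather than per-row Python. The field names, the 30-day delinquency cutoff, and the scorecard weights are all hypothetical, not a real risk model.

```python
from dataclasses import dataclass

@dataclass
class Account:
    balance: float
    credit_limit: float
    days_past_due: int

def engineer_features(acct: Account) -> dict:
    """Derive model inputs from raw account fields."""
    utilization = acct.balance / acct.credit_limit if acct.credit_limit else 0.0
    return {
        "utilization": utilization,          # fraction of credit line in use
        "delinquent": acct.days_past_due > 30,  # assumed delinquency cutoff
    }

def score(features: dict) -> float:
    """Toy scorecard with illustrative weights only."""
    return 0.6 * features["utilization"] + (0.4 if features["delinquent"] else 0.0)

feats = engineer_features(Account(balance=4500.0, credit_limit=10000.0, days_past_due=45))
risk = score(feats)
print(round(risk, 2))  # 0.67
```

The same derived columns would feed model training, scoring, and the periodic retraining mentioned above.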
Skills and competencies:
Required:
· Strong analytical skills in conducting sophisticated statistical analysis using bureau/vendor data, customer performance data, and macro-economic data to solve business problems.
· Working experience in PySpark and Scala to develop, validate, and implement models and code in Credit Risk/Banking.
· Experience with distributed systems such as Hadoop/MapReduce, Spark, streaming data processing, and cloud architecture.
· Familiarity with machine learning frameworks and libraries (e.g., scikit-learn, SparkML, TensorFlow, PyTorch).
· Experience in systems integration, web services, and batch processing.
· Experience in migrating code to PySpark/Scala is a big plus.
· Ability to act as a liaison, conveying the information needs of the business to IT and data constraints to the business, with equal fluency in business strategy and IT strategy, business processes, and workflow.
· Flexibility in approach and thought process.
· Willingness to learn and comprehend periodic changes in regulatory requirements per the Fed.
Desired Competencies (Technical/Behavioral Competency)
Must-Have
· Strong understanding of Kafka concepts, including topics, partitions, consumers, producers, and security.
· Experience with testing Kafka Connect, Kafka Streams, and other Kafka ecosystem components.
· API testing experience.
· Xray and test automation experience.
· Expertise with Postman/SOAP.
· Experience with Agile, JIRA, and Confluence.
· Strong familiarity with data formats such as XML, JSON, CSV, and Avro.
· Strong hands-on experience with SQL and MongoDB.
· Continuous integration and automated testing.
· Working knowledge and experience of Git.
Good-to-Have
· Ability to troubleshoot Kafka-related issues; strong in Kafka client configuration and troubleshooting.
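As a sketch of the data-format familiarity above, the snippet below round-trips one test fixture through JSON and CSV, the kind of payload preparation that Kafka and API test automation routinely does. The record fields are hypothetical, and a real Kafka test would publish the JSON bytes to a topic rather than just decode them locally.

```python
import csv
import io
import json

# Hypothetical event fixture, as a test might publish to a Kafka topic.
event = {"order_id": "A-100", "status": "SHIPPED", "qty": 3}

# JSON round-trip, as for an API or Kafka message body.
json_bytes = json.dumps(event).encode("utf-8")
decoded = json.loads(json_bytes)

# CSV round-trip, as for a flat-file feed.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(event))
writer.writeheader()
writer.writerow(event)
rows = list(csv.DictReader(io.StringIO(buf.getvalue())))

print(decoded == event)   # JSON preserves types, so this is True
print(rows[0]["qty"])     # CSV fields come back as strings: "3"
```

Assertions like these catch the classic format mismatch (numeric fields silently becoming strings) before a consumer under test sees them.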
Responsibility of / Expectations from the Role
1. Engage with the customer to understand the requirements, provide technical solutions, and provide value-added suggestions.
2. Help build and manage the team of Kafka and Java developers in the near future.
• Technical expertise in the development of Master Data Management (MDM), data extraction, transformation, and load (ETL) applications, and big data, using existing and emerging technology platforms and cloud architecture.
• Functions as lead developer.
• Support system analysis, technical/data design, development, and unit testing, and oversee the end-to-end data solution.
• Technical SME in Master Data Management applications, ETL, big data, and cloud technologies.
• Collaborate with IT teams to ensure technical designs and implementations account for requirements, standards, and best practices.
• Performance tuning of end-to-end MDM, database, ETL, and big data processes, or in the source/target database endpoints as needed.
• Mentor and advise junior members of the team.
• Serve as technical lead and solution lead for a team of onshore and offshore developers.
Responsibility of / Expectations from the Role
Assist in the design and implementation of a Snowflake-based analytics solution (data lake and data warehouse) on Azure.
Profound experience in designing and developing data integration solutions using transformation tools such as DBT.
Hands-on experience implementing cloud data warehouses using Snowflake and Azure Data Factory.
Solid MS SQL Server skills, including reporting experience.
Work closely with product managers and engineers to design, implement, test, and continually improve scalable data solutions and services running on the DBT and Snowflake cloud platforms.
Implement critical and non-critical system data integration and ingestion fixes for the data platform and environment.
Ensure root-cause resolution of identified problems.
Monitor and support Data Solutions jobs and processes to meet the daily SLA.
Analyze the current analytics environment and make recommendations for data warehouse modernization and migration to the cloud.
Develop Snowflake deployment (using Azure DevOps or a similar CI/CD tool) and usage best practices.
Follow best practices and standards around data governance, security, and privacy.
Comfortable working in a fast-paced team environment, coordinating multiple projects.
Effective software development life cycle management skills and experience with GitHub.
Leverage tools like Fivetran, DBT, Snowflake, and GitHub to drive ETL, data modeling, and analytics.
Document data transformations and data analytics.
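The SLA-monitoring responsibility above can be sketched as a simple check over job completion times. This is illustrative only: the job names, the run timestamps, and the 06:00 daily cutoff are hypothetical, and in practice the timestamps would come from Snowflake's query history or the orchestrator's run metadata.

```python
from datetime import datetime, time

# Hypothetical completion times for daily ingestion jobs.
runs = {
    "load_orders": datetime(2024, 5, 1, 5, 40),
    "load_customers": datetime(2024, 5, 1, 6, 25),
}

SLA_DEADLINE = time(6, 0)  # assumed daily cutoff

def late_jobs(completions: dict, deadline: time) -> list:
    """Return names of jobs whose completion time missed the daily SLA cutoff."""
    return sorted(name for name, done in completions.items() if done.time() > deadline)

breaches = late_jobs(runs, SLA_DEADLINE)
print(breaches)  # ['load_customers']
```

A check like this would typically run on a schedule and page the on-call owner when `breaches` is non-empty.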
Similar companies
Certa’s no-code platform makes it easy to digitize and manage the lifecycle of all your suppliers, partners, and customers. With automated onboarding, contract lifecycle management, and ESG management, Certa eliminates the procurement bottleneck and allows companies to onboard third parties 3x faster.
Geotrackers is a technology company offering end-to-end solutions that help organizations manage their field resources more effectively, be they vehicles, assets, or personnel. We provide GPS-based vehicle tracking solutions and mobile solutions for field force management. Our solutions have gained popularity owing to the multiple benefits they offer: increased productivity of field resources, reduced field-operation costs, better customer service, and better safety and security for man and material. All our solutions are cloud-based and offered on the SaaS model, so they are easy to deploy and economical to use. Our primary targets are organizations with a sizeable fleet of vehicles or sales and service personnel. Our solutions are used by over 1,000 organizations across industry sectors such as logistics, transportation, food processing, mining, and cash logistics, including TATA Steel, ICICI Bank, DTDC, IOCL, and many more. We are headquartered in Delhi, with offices in Pune, Mumbai, Chennai, Kolkata, Indore, Bangalore, Hyderabad, Ahmedabad, Raipur, Kanpur, and Chandigarh.
Clink is reimagining restaurant growth — no commissions, no food bloggers, just AI-powered loyalty and real customer influence.
Our platform helps restaurants turn diners into repeat customers and brand advocates using smart rewards and Instagram-powered virality. With every visit, customers earn personalized rewards and post about their experience on Instagram, driving organic traffic, not paid ads.
If you're excited by AI, social growth, and building the future of hospitality tech — Clink is the place to be.
India hosts 25M+ events every year, but ordering food is still a nightmare: unreliable vendors, inconsistent pricing, and no seamless way to book catering. We’re fixing this.
Craft My Plate is a vertically integrated food marketplace where customers can instantly customize menus, place orders, and get them delivered conveniently, without dealing with vendors.
RockED is the premier people development platform for the automotive industry, supporting the entire employee lifecycle from pre-hire and onboarding to upskilling and career transitions. With microlearning content, gamified delivery, and real-time feedback, RockED is educating the automotive workforce and solving the industry's greatest business challenges.
The RockED Company Inc. is headquartered in Florida. Backed by top industry experts and investors, we’re a well-funded startup on an exciting growth journey. Our India R&D team, based in Bangalore (office on Church Street), leads all product and technology innovation, and we’re now expanding this team.
AI will play a key role in improving intelligence, personalization, and overall user experience across the platform.