11+ ICS Jobs in Chennai | ICS Job openings in Chennai
Apply to 11+ ICS Jobs in Chennai on CutShort.io. Explore the latest ICS Job opportunities across top companies like Google, Amazon & Adobe.
JD –
Experience implementing integration solutions using Oracle Integration Cloud Service (OIC).
Experience developing integrations between SaaS applications (Oracle Cloud ERP, Oracle Cloud HCM) and between SaaS and PaaS applications.
Should have worked extensively with at least 3-4 technology adapters, such as the File, Database, Oracle ERP, and FTP adapters.
Should have excellent skills in web service technologies such as XML, XPath, XSLT, SOAP, WSDL, and XSD.
Experience in all phases of the software development lifecycle, from requirements gathering to documentation, testing, implementation, and support. Ability to troubleshoot technical and configuration issues.
Should be able to communicate effectively with functional and technical groups and with various technical team members.
Ensure completion of tasks, milestones, and components, including technical specifications, design specifications, configurations, quality assurance, implementations, and project reviews.
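As a quick illustration of the XML/XPath skills listed above, the sketch below pulls values out of a hypothetical ERP payload using only Python's standard library. The element names and invoice data are invented for the example; a real OIC integration would map real adapter payloads instead.

```python
import xml.etree.ElementTree as ET

# Hypothetical invoice payload, such as an ERP adapter might return
payload = """
<invoices>
  <invoice id="INV-001"><amount currency="USD">1200.50</amount></invoice>
  <invoice id="INV-002"><amount currency="USD">80.00</amount></invoice>
</invoices>
"""

root = ET.fromstring(payload)

# XPath-style query: every <amount> element beneath an <invoice>
amounts = [float(a.text) for a in root.findall("./invoice/amount")]
total = sum(amounts)

# Attribute access on the matched elements
ids = [inv.get("id") for inv in root.findall("./invoice")]
```

`ElementTree` supports a useful subset of XPath; for full XPath 1.0 or XSLT, third-party libraries would be needed.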
Role Overview:
We are looking for a skilled Data Scientist with expertise in data analytics, machine learning, and AI to join our team. The ideal candidate will have a strong command of data tools, programming, and knowledge of LLMs and Generative AI, contributing to the growth and automation of our business processes.
Key Responsibilities:
1. Data Analysis & Visualization:
- Develop and manage data pipelines, ensuring data accuracy and integrity.
- Design and implement insightful dashboards using Power BI to help stakeholders make data-driven decisions.
- Extract and analyze complex data sets using SQL to generate actionable insights.
2. Machine Learning & AI Models:
- Build and deploy machine learning models to optimize key business functions like discount management, lead qualification, and process automation.
- Apply Natural Language Processing (NLP) techniques for text extraction, analysis, and classification from customer documents.
- Implement and fine-tune Generative AI models and large language models (LLMs) for various business applications, including prompt engineering for automation tasks.
3. Automation & Innovation:
- Use AI to streamline document verification, data extraction, and customer interaction processes.
- Innovate and automate manual processes, creating AI-driven solutions for internal teams and customer-facing systems.
- Stay abreast of the latest advancements in machine learning, NLP, and generative AI, applying them to real-world business challenges.
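The text classification responsibility above can be sketched in miniature. The following is a toy multinomial Naive Bayes classifier in pure Python; the training snippets and labels ("invoice" vs. "complaint") are invented stand-ins for a real labelled corpus of customer documents, and production work would use a library such as scikit-learn or an LLM instead.

```python
import math
from collections import Counter, defaultdict

# Toy labelled corpus (invented for the sketch)
TRAIN = [
    ("please find attached invoice for payment", "invoice"),
    ("invoice total due by end of month", "invoice"),
    ("complaint about delayed delivery of order", "complaint"),
    ("customer complaint regarding damaged goods", "complaint"),
]

def train_nb(samples):
    """Fit a multinomial Naive Bayes model (word counts per label)."""
    word_counts = defaultdict(Counter)   # label -> word frequencies
    label_counts = Counter()             # label -> document count
    for text, label in samples:
        label_counts[label] += 1
        word_counts[label].update(text.split())
    vocab = {w for c in word_counts.values() for w in c}
    return word_counts, label_counts, vocab

def classify(text, model):
    """Pick the label maximizing log prior + smoothed log likelihoods."""
    word_counts, label_counts, vocab = model
    total_docs = sum(label_counts.values())
    best_label, best_score = None, float("-inf")
    for label in label_counts:
        score = math.log(label_counts[label] / total_docs)
        denom = sum(word_counts[label].values()) + len(vocab)
        for w in text.split():
            # Add-one (Laplace) smoothing so unseen words don't zero out
            score += math.log((word_counts[label][w] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

model = train_nb(TRAIN)
```

With this model, `classify("invoice payment due", model)` picks the "invoice" label because those words dominate that class's counts.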
Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, Statistics, or a related field.
- 4-7 years of experience as a Data Scientist, with proficiency in Python, SQL, Power BI, and Excel.
- Expertise in building machine learning models and utilizing NLP techniques for text processing and automation.
- Experience in working with large language models (LLMs) and generative AI to create efficient and scalable solutions.
- Strong problem-solving skills, with the ability to work independently and in teams.
- Excellent communication skills, with the ability to present complex data in a simple, actionable way to non-technical stakeholders.
If you’re excited about leveraging data and AI to solve real-world problems, we’d love to have you on our team!

A global provider of Business Process Management and Outsourcing solutions.
Appian Developer / Sr Appian Developer
· Extensive experience in Appian BPM application development
· Knowledge of Appian architecture and best practices for its objects
· Participate in analysis, design, and new development of Appian based applications
· Provide team and technical leadership to Scrum teams
· Must be able to multi-task, work in a fast-paced environment, and resolve problems faced by the team
· Build applications: interfaces, process flows, expressions, data types, sites, integrations, etc.
· Proficient with SQL queries and with accessing data present in DB tables and views
· Experience in analysis and design of process models, records, reports, SAIL, forms, gateways, smart services, integration services, and web services
· Experience working with different Appian object types, query rules, constant rules, and expression rules
Qualifications
· At least 6 years of experience implementing BPM solutions using Appian 19.x or higher
· Over 8 years implementing IT solutions using BPM or integration technologies
· Certification mandatory: L1 and L2
· Experience in Scrum/Agile methodologies with Enterprise level application development projects
· Good understanding of database concepts and strong working knowledge of at least one major database, e.g. Oracle, SQL Server, or MySQL
Additional information
Skills Required
· Appian BPM application development on version 19.x or higher
· Experience with integrations using web services, e.g. XML, REST, WSDL, SOAP, API, JDBC, JMS
· Good leadership skills and the ability to technically lead a team of software engineers
· Experience working in Agile Scrum teams
· Good communication skills
Skills: Machine Learning, Deep Learning, Artificial Intelligence, Python.
Location: Chennai
Domain knowledge: Data cleaning, modelling, analytics, statistics, machine learning, AI
Requirements:
· To be part of Digital Manufacturing and Industrie 4.0 projects across the Saint Gobain group of companies
· Design and develop AI/ML models to be deployed across SG factories
· Knowledge of Hadoop, Apache Spark, MapReduce, Scala, Python programming, and SQL and NoSQL databases is required
· Should be strong in statistics, data analysis, data modelling, machine learning techniques, and neural networks
· Prior experience in developing AI and ML models is required
· Experience with data from the Manufacturing Industry would be a plus
Roles and Responsibilities:
· Develop AI and ML models for the Manufacturing Industry with a focus on Energy, Asset Performance Optimization and Logistics
· Multitasking and good communication skills are necessary
· Entrepreneurial attitude.
- Hands-on experience with:
- EJS, Express.js,
- MVC, MVVM, full-stack MVC,
- GitHub,
- REST APIs.
- Should have strong database knowledge.
- Should have good knowledge of version control.
- Ability to restructure existing code.
- Hands-on experience in backend architecture.
- Ability to lead a team.
- Should be a self-starter.
Skills: UI Automation, Selenium, Java, Rest Assured, TestNG, BDD, Cucumber, Git, Jenkins
Requirements:
- Hands-on experience working across multiple projects
- Hands-on knowledge of the BDD Cucumber framework
- Hands-on knowledge of Maven
- Hands-on knowledge of Selenium and Appium
- Hands-on knowledge of Jenkins
- Basic knowledge of CI/CD concepts
- Exposure to UI Automation, Selenium, Java, Rest Assured, TestNG
- Experience using at least one defect management tool: JIRA, HP ALM, ADO, etc.
- Immediate joiners with 5 to 10 years of experience.
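Selenium work of the kind listed above usually leans on the Page Object pattern. The sketch below shows the pattern's shape in Python with an invented stand-in driver (no real browser or Selenium install), so only the structure is illustrated; the locators and URLs are hypothetical.

```python
class FakeDriver:
    """Stand-in for a Selenium WebDriver, used only for this sketch."""
    def __init__(self):
        self.fields = {}
        self.current_url = "https://example.test/login"  # hypothetical URL

    def type_into(self, locator, text):
        self.fields[locator] = text

    def click(self, locator):
        # Pretend a successful login navigates to the home page
        if locator == "login-button" and self.fields.get("password"):
            self.current_url = "https://example.test/home"

class LoginPage:
    """Page Object: encapsulates one page's locators and actions."""
    USER_FIELD = "username"
    PASS_FIELD = "password"
    LOGIN_BTN = "login-button"

    def __init__(self, driver):
        self.driver = driver

    def login(self, user, password):
        self.driver.type_into(self.USER_FIELD, user)
        self.driver.type_into(self.PASS_FIELD, password)
        self.driver.click(self.LOGIN_BTN)
        return self.driver.current_url

driver = FakeDriver()
page = LoginPage(driver)
```

Keeping locators inside page classes means a UI change touches one file, not every test, which is the main point of the pattern.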
- Should have team leading experience.
- Should be keen to work as a Developer.
- Java, Spring Boot, and design patterns are key areas where candidates should be excellent.
- Good communication skills are mandatory.
- Should be willing to work on alternate Saturdays (10 AM to 4:30 PM).
- Candidates will have to relocate to Chennai.
- Strong SQL skills and PostgreSQL database knowledge.
- Cloud experience in deployment (CI/CD).
- Experience writing unit test cases.
- Angular: good to have.
Job Responsibilities
- Design, build & test ETL processes using Python & SQL for the corporate data warehouse
- Inform, influence, support, and execute our product decisions
- Maintain advertising data integrity by working closely with R&D to organize and store data in a format that provides accurate data and allows the business to quickly identify issues.
- Evaluate and prototype new technologies in the area of data processing
- Think quickly, communicate clearly and work collaboratively with product, data, engineering, QA and operations teams
- High energy level, strong team player and good work ethic
- Data analysis, understanding of business requirements and translation into logical pipelines & processes
- Identification, analysis & resolution of production & development bugs
- Support the release process including completing & reviewing documentation
- Configure data mappings & transformations to orchestrate data integration & validation
- Provide subject matter expertise
- Document solutions, tools & processes
- Create & support test plans with hands-on testing
- Peer reviews of work developed by other data engineers within the team
- Establish good working relationships & communication channels with relevant departments
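The "design, build & test ETL processes using Python & SQL" responsibility above can be sketched end to end in a few lines. This toy pipeline runs against an in-memory SQLite database with invented order data; a production version would target the warehouse, typically orchestrated as Airflow tasks.

```python
import sqlite3

# Stand-in source database (invented data for the sketch)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount_cents INTEGER)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                 [(1, 1250), (2, 8000), (3, 499)])

# Extract: read the raw rows
rows = conn.execute("SELECT id, amount_cents FROM raw_orders").fetchall()

# Transform: convert cents to currency units, drop orders under 5.00
cleaned = [(oid, cents / 100) for oid, cents in rows if cents >= 500]

# Load: write into the curated table the warehouse exposes
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", cleaned)

total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
```

Each extract/transform/load stage maps naturally onto a separate, independently testable Airflow task, which is what makes the split worth keeping even for small pipelines.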
Skills and Qualifications we look for
- University degree 2.1 or higher (or equivalent) in a relevant subject. A Master's degree in any data subject will be a strong advantage.
- 4-6 years of experience in data engineering.
- Strong coding ability and software development experience in Python.
- Strong hands-on experience with SQL and Data Processing.
- Google Cloud Platform (Cloud Composer, Dataflow, Cloud Functions, BigQuery, Cloud Storage, Dataproc).
- Good working experience with at least one ETL tool (Airflow preferred).
- Should possess strong analytical and problem-solving skills.
- Good-to-have skills: Apache PySpark, CircleCI, Terraform.
- Motivated, self-directed, able to work with ambiguity and interested in emerging technologies, agile and collaborative processes.
- Understanding & experience of agile / scrum delivery methodology
Responsibilities
Work with development teams and product managers to ideate software solutions
Design client-side and server-side architecture
Build the front-end of applications through appealing visual design
Develop and manage well-functioning databases and applications
Write effective APIs
Test software to ensure responsiveness and efficiency
Troubleshoot, debug and upgrade software
Create security and data protection settings
Build features and applications with a mobile responsive design
Write technical documentation
Work with data scientists and analysts to improve software
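To illustrate the "write effective APIs" responsibility above, here is a minimal JSON endpoint built on Python's standard library alone. The route, port handling, and in-memory data are invented for the sketch; a real full-stack project would more likely use Express/Node.js or a Python framework such as Flask.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical in-memory "database" for the sketch
ITEMS = {1: {"id": 1, "name": "widget"}}

class ApiHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/items":
            body = json.dumps(list(ITEMS.values())).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):
        pass  # silence per-request logging in the sketch

def serve(port=0):
    """Start the API on a background thread; port 0 picks a free port."""
    server = HTTPServer(("127.0.0.1", port), ApiHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

A client then just issues `GET /items` and parses the JSON body; setting `Content-Type` and `Content-Length` explicitly is the part beginners most often skip.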
Requirements And Skills
Proven experience as a Full Stack Developer or similar role
Experience developing desktop and mobile applications
Familiarity with common stacks
Knowledge of multiple front-end languages and libraries (e.g. HTML/CSS, JavaScript, XML, jQuery)
Knowledge of multiple back-end languages (e.g. C#, Java, Python) and JavaScript frameworks (e.g. Angular, React, Node.js)
Familiarity with databases (e.g. MySQL, MongoDB), web servers (e.g. Apache), and UI/UX design
Excellent communication and teamwork skills
Great attention to detail
Organizational skills
An analytical mind
Skills: MongoDB, Express, HTML/CSS, JavaScript, XML, jQuery, Java, MySQL
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
- Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- We are looking for a candidate with 5+ years of experience in a Data Engineer role, who has attained a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field. They should also have experience using the following software/tools:
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience with stream-processing systems: Storm, Spark-Streaming, etc.
- Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
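The stream-processing experience listed above centres on one core idea: aggregating over a moving window of recent events. The sketch below shows that idea in plain Python with an in-process simulated stream (the latency numbers are invented); Storm or Spark Streaming apply the same pattern across distributed partitions.

```python
from collections import deque

class SlidingWindowAverage:
    """Running average over the most recent `size` events."""
    def __init__(self, size):
        self.window = deque(maxlen=size)  # old events fall off automatically

    def ingest(self, value):
        self.window.append(value)
        return sum(self.window) / len(self.window)

# Simulated event stream, e.g. request latencies in milliseconds
stream = [100, 200, 300, 400]
agg = SlidingWindowAverage(size=3)
averages = [agg.ingest(v) for v in stream]
```

The `maxlen` deque is doing the real work: eviction of stale events is automatic, which is exactly what windowed operators in streaming engines manage for you at scale.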


