
Tata Consultancy Services
http://www.tcs.com
Jobs at Tata Consultancy Services


Must-Have
1. Experience working with ML libraries and packages such as scikit-learn, NumPy, Pandas, TensorFlow, Matplotlib, Caffe, etc.
2. Deep learning frameworks: PyTorch, spaCy, Keras
3. Deep learning architectures: LSTM, CNN, self-attention, and Transformers
4. Experience with image processing and computer vision is a must
5. Designing data science applications: Large Language Models (LLMs), Generative Pre-trained Transformers (GPT), generative AI techniques, Natural Language Processing (NLP), machine learning techniques, Python, Jupyter Notebook, common data science packages (TensorFlow, scikit-learn, Keras, etc.), LangChain, Flask, FastAPI, prompt engineering
6. Programming experience in Python
7. Strong written and verbal communication skills
8. Excellent interpersonal and collaboration skills
Good-to-Have
1. Experience working with vector databases and graph representations of documents
2. Experience building or maintaining MLOps pipelines
3. Experience with cloud computing infrastructure such as AWS SageMaker or Azure ML for implementing ML solutions is preferred
4. Exposure to Docker and Kubernetes
SN Role descriptions / Expectations from the Role
1 Design and implement scalable and efficient data architectures to support generative AI workflows.
2 Fine-tune and optimize large language models (LLMs) for generative AI; conduct performance evaluation and benchmarking for LLMs and machine learning models
3 Apply prompt engineering techniques as required by the use case (see the sketch after this list)
4 Collaborate with research and development teams to build large language models for generative AI use cases; plan and break down larger data science tasks into lower-level tasks
5 Lead junior data engineers on tasks such as data pipeline design, dataset creation, and deployment; use data visualization tools, machine learning techniques, natural language processing, feature engineering, deep learning, and statistical modelling as required by the use case.
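Item 3 above mentions prompt engineering. As a rough, non-authoritative illustration, here is a minimal Python sketch of a reusable few-shot prompt template; the template text, category labels, and the `complete` callable are hypothetical placeholders, not details from this posting.

```python
# A minimal prompt-engineering sketch (illustrative only).
# The few-shot template and category labels are hypothetical;
# `complete` stands in for any LLM completion callable (prompt -> text).

FEW_SHOT_TEMPLATE = """You are a support-ticket classifier.

Examples:
Ticket: "My invoice total is wrong." -> Category: billing
Ticket: "The app crashes on login." -> Category: technical

Ticket: "{ticket}" -> Category:"""

def build_prompt(ticket: str) -> str:
    """Inject the user ticket into the few-shot template."""
    return FEW_SHOT_TEMPLATE.format(ticket=ticket.replace('"', "'"))

def classify(ticket: str, complete) -> str:
    """Run the built prompt through the supplied completion function."""
    return complete(build_prompt(ticket)).strip().lower()
```

The same pattern extends naturally to a FastAPI endpoint or a LangChain prompt template, both of which the must-have list names.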
Upgrade/migrate DB2 to newer versions (currently on v12)
- Regularly apply PTFs/APARs as required
- Manage DB2 ISV products and upgrade them to supported versions if required
- Manage system catalogs
- DB2 administration on z/OS: SQL creation, DB2 command creation and execution, and DB2 performance tuning
- Physical database definition (DDL) and understanding of logical database design
- Monitoring: threads, table spaces, and index spaces
- Monitoring instances for events and resolution/escalation
- Performance tuning: database and SQL tuning
- DB2 recycle
- DB2 image copy (full/incremental)
- Database creation
- Grant/revoke of authorities
- DB2 archive logs
- Experience with DR setup through various sync methods
- Build and maintain databases and related objects such as table spaces, tables, indexes, views, and triggers
- Online utilities: CHECKDATA, COPY, QUIESCE, REPAIR, REORG, LOAD, UNLOAD, etc. (see the catalog-query sketch after this list)
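As a hedged illustration of the catalog queries behind the monitoring duties above, here is a minimal Python sketch assuming the ibm_db driver (pip install ibm_db); the DSN, credentials, and database name are hypothetical placeholders, not details from this posting.

```python
# Query the DB2 for z/OS catalog to list table spaces in a database,
# one of the routine monitoring tasks listed above.
import ibm_db

# Hypothetical DSN; location, host, port, and credentials are placeholders.
DSN = (
    "DATABASE=DB2PROD;HOSTNAME=zos-host.example.com;PORT=5021;"
    "PROTOCOL=TCPIP;UID=dbauser;PWD=secret;"
)

conn = ibm_db.connect(DSN, "", "")

stmt = ibm_db.exec_immediate(
    conn,
    "SELECT NAME, NTABLES FROM SYSIBM.SYSTABLESPACE "
    "WHERE DBNAME = 'DSNDB04'",
)
row = ibm_db.fetch_assoc(stmt)
while row:
    print(row["NAME"], row["NTABLES"])
    row = ibm_db.fetch_assoc(stmt)

ibm_db.close(conn)
```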
Mainframe Developer Job Description
A Mainframe Developer is responsible for designing, developing, and maintaining software applications on mainframe systems. Here's a brief overview:
Key Responsibilities
- Application Development: Develop and maintain mainframe applications using languages like COBOL, PL/1, or Assembler.
- Code Maintenance: Perform code reviews, debugging, and optimization to ensure high-quality code.
- System Integration: Integrate mainframe applications with other systems and technologies.
- Testing and Quality Assurance: Test and validate mainframe applications to ensure they meet business requirements.
- Troubleshooting: Troubleshoot mainframe issues and resolve technical problems.
Technical Skills
- Mainframe Systems: Strong understanding of mainframe systems, including z/OS, z/VM, or z/VSE.
- Programming Languages: Proficiency in mainframe programming languages like COBOL, PL/1, or Assembler.
- Database Management: Knowledge of mainframe database management systems like DB2 or IMS.
- Job Scheduling: Familiarity with job scheduling tools like CA-7 or Control-M.
- Security: Understanding of mainframe security concepts and best practices.

Must-Have
1. Hands-on knowledge of machine learning, deep learning, TensorFlow, Python, and NLP
2. Stay up to date on the latest AI developments relevant to the business domain
3. Conduct research and development processes for AI strategies
4. Experience developing and implementing generative AI models, with a strong understanding of deep learning techniques such as GPT, VAEs, and GANs
5. Experience with transformer models such as BERT, GPT, and RoBERTa, and a solid understanding of their underlying principles is a plus
Good-to-Have
1. Knowledge of software development methodologies such as Agile or Scrum
2. Strong communication skills, with the ability to effectively convey complex technical concepts to a diverse audience
3. Experience with natural language processing (NLP) techniques and tools such as spaCy, NLTK, or Hugging Face
4. Ensure the quality of code and applications through testing, peer review, and code analysis
5. Root cause analysis and bug correction
6. Familiarity with version control systems, preferably Git
7. Experience building or maintaining cloud-native applications
8. Familiarity with cloud platforms such as AWS, Azure, or Google Cloud is a plus
SN Role descriptions / Expectations from the Role
1 Design, develop, and configure GenAI applications to meet the business requirements.
2 Optimize existing generative AI models for improved performance, scalability, and efficiency.
3 Develop and maintain AI pipelines, including data preprocessing, feature extraction, model training, and evaluation (see the sketch after this list).
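Item 3 refers to model training and evaluation stages. As a minimal, non-authoritative sketch of one such stage, the snippet below loads a small pretrained transformer for text generation with Hugging Face transformers (named in the Good-to-Have list above); the model name and prompt are illustrative only.

```python
# Load a small pretrained causal LM and generate text with the
# Hugging Face `transformers` pipeline API.
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")

out = generator(
    "Summarize the key risks of deploying LLMs in production:",
    max_new_tokens=40,
    num_return_sequences=1,
)
print(out[0]["generated_text"])
```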
- Work with the team in capacity of GCP Data Engineer on day-to-day activities.
- Solve problems at hand with utmost clarity and speed.
- Work with Data analysts and architects to help them solve any specific issues with tooling/processes.
- Design, build, and operationalize large-scale enterprise data solutions and applications using one or more GCP data and analytics services in combination with third-party tools - Python/Java/React.js, Airflow ETL skills, and GCP services (BigQuery, Dataflow, Cloud SQL, Cloud Functions, Data Lake).
- Design and build production data pipelines from ingestion to consumption within a big data architecture.
- Apply GCP BigQuery (BQ) modeling and performance tuning techniques (see the sketch after this list).
- RDBMS and NoSQL database experience.
- Knowledge of orchestrating workloads on the cloud.
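As a hedged sketch of the BQ modeling and tuning point above, the snippet below uses the google-cloud-bigquery client (pip install google-cloud-bigquery); the project, dataset, and table names are hypothetical placeholders.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # hypothetical project

# Partitioning and clustering are the usual first levers for BigQuery
# performance tuning: they prune scanned bytes before the query runs.
ddl = """
CREATE TABLE IF NOT EXISTS sales.orders (
  order_id STRING,
  order_ts TIMESTAMP,
  region   STRING,
  amount   NUMERIC
)
PARTITION BY DATE(order_ts)
CLUSTER BY region
"""
client.query(ddl).result()

# A typical consumption query that benefits from the partition filter.
query = """
SELECT region, SUM(amount) AS total
FROM sales.orders
WHERE order_ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
GROUP BY region
"""
for row in client.query(query).result():
    print(row.region, row.total)
```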
Very good understanding of Azure Monitor, Log Analytics, alerting, dashboards, and workbooks.
Good working knowledge of Kusto Query Language (KQL) or Python for data analysis and querying (see the sketch after the Good-to-Have list below).
Good working knowledge of Azure Functions for automation.
Apply PowerShell programming skills to develop clean code that is stable, consistent, and well-performing.
GitHub/GitLab for DevOps pipelines.
Experience in monitoring infrastructure, managing escalations, and ensuring timely issue resolution to maintain operational stability and stakeholder satisfaction.
Good-to-Have
Knowledge of Terraform, ARM, or Bicep for infrastructure as code.
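As a non-authoritative sketch of the KQL-plus-Python skills listed above, the snippet below runs a Log Analytics query using the azure-monitor-query and azure-identity packages; the workspace ID is a hypothetical placeholder, and Heartbeat is a standard Log Analytics table.

```python
# Run a KQL query against a Log Analytics workspace from Python.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient, LogsQueryStatus

client = LogsQueryClient(DefaultAzureCredential())

# Count heartbeats per computer over the last day.
KQL = "Heartbeat | summarize heartbeats = count() by Computer | top 5 by heartbeats"

response = client.query_workspace(
    workspace_id="00000000-0000-0000-0000-000000000000",  # placeholder
    query=KQL,
    timespan=timedelta(days=1),
)

if response.status == LogsQueryStatus.SUCCESS:
    for table in response.tables:
        for row in table.rows:
            print(row)
```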
Teradata Developer Job Description
A Teradata Developer is responsible for designing, developing, and implementing data warehousing solutions using Teradata. Here's a brief overview:
Key Responsibilities
- Data Warehousing: Design and develop data warehousing solutions using Teradata.
- ETL Development: Develop ETL (Extract, Transform, Load) processes to load data into Teradata.
- SQL Development: Write complex SQL queries to support business intelligence and reporting.
- Performance Optimization: Optimize Teradata database performance for fast query execution.
- Data Modeling: Design and implement data models to support business requirements.
Technical Skills
- Teradata: Strong understanding of Teradata database management system.
- SQL: Proficiency in writing complex SQL queries (see the sketch after this list).
- ETL: Experience with ETL tools like Informatica PowerCenter or Teradata Parallel Transporter (TPT).
- Data Warehousing: Knowledge of data warehousing concepts and best practices.
- Data Modeling: Understanding of data modeling concepts and techniques.
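As a hedged illustration of the complex-SQL skill above, here is a minimal Python sketch using the teradatasql driver (pip install teradatasql); the host, credentials, and table names are hypothetical placeholders.

```python
# Connect to Teradata and run a windowed aggregation of the kind
# used in reporting workloads.
import teradatasql

with teradatasql.connect(
    host="tdprod.example.com", user="etl_user", password="secret"
) as con:
    with con.cursor() as cur:
        cur.execute(
            """
            SELECT region,
                   sales_total,
                   RANK() OVER (ORDER BY sales_total DESC) AS sales_rank
            FROM (
                SELECT region, SUM(amount) AS sales_total
                FROM sales.orders
                GROUP BY region
            ) t
            """
        )
        for row in cur.fetchall():
            print(row)
```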
Responsibility of / Expectations from the Role
1 Driving Informatica changes and supporting production issues
2 Customer interaction as required
3 Requirement gathering and analysis
4 Resolution of server-related issues and GUI-related issues
5 Taking end-to-end ownership of Informatica changes
Role descriptions / Expectations from the Role
Develop, deploy, and maintain Power Platform solutions, including canvas apps, model-driven apps, and flows.
Collaborate with cross-functional teams to define, design, and ship new features.
Troubleshoot and resolve issues related to Power Platform applications.
Ensure the security, scalability, and reliability of Power Platform solutions.
You will be joining a fast-growing product team within the End User Technology Services organisation. You will be part of the build-out of this capability in the region and will liaise with our customers across all the LEADING BANK business lines.
DataStage Developer Job Description
A DataStage Developer is responsible for designing, developing, and implementing data integration solutions using IBM InfoSphere DataStage. Here's a brief overview:
Key Responsibilities
- Data Integration: Design and develop data integration jobs using DataStage to extract, transform, and load (ETL) data from various sources.
- Job Development: Develop and test DataStage jobs to meet business requirements.
- Data Transformation: Use DataStage transformations to cleanse, aggregate, and transform data.
- Performance Optimization: Optimize DataStage jobs for performance and scalability.
- Troubleshooting: Troubleshoot DataStage issues and resolve data integration problems.
Technical Skills
- DataStage: Strong understanding of IBM InfoSphere DataStage and its components.
- ETL: Experience with ETL concepts and data integration best practices.
- Data Transformation: Knowledge of data transformation techniques and DataStage transformations.
- SQL: Familiarity with SQL and database concepts.
- Data Modeling: Understanding of data modeling concepts and data warehousing.

Similar companies
At Torero Softwares Ltd, we build next-gen ERP solutions that power businesses in healthcare, pharma, FMCG, distribution, and retail. With 25+ years of expertise and a 3,500+ client base, our flagship product, Medica Ultimate™, helps companies streamline operations, boost efficiency, and stay compliant.
Why Join Us?
🚀 Fast-Growing Tech Company – Work on industry-leading SaaS & ERP solutions
💡 Innovation-Driven – Be part of a team solving real-world business challenges
📈 Career Acceleration – Hands-on learning, mentorship & growth opportunities
📚 Collaborative Culture – Work alongside tech experts in a dynamic environment
Whether in sales, implementation, or customer success, you'll help transform businesses with technology.
📍 Location: Lower Parel, Mumbai
📩 Contact: Simran Jain | ✉ [email protected] | 📞 9702074236 (Call/WhatsApp)
HelloRamp.ai is an AI-powered dealership management system specifically designed for used vehicle dealers. We specialize in transforming how used cars are sold online by providing professional studio-quality images and virtual experiences using 3D technology and Generative AI.
We are revolutionizing how dealerships showcase their vehicles with cutting-edge AI-powered tools. Our Virtual Photo Booth enables dealerships to create professional-quality images with 4K clarity and immersive 360° interactive walkthroughs—all with just a mobile app, eliminating the need for a studio. Seamlessly integrated with DMS/IMS platforms, HelloRamp.ai offers features like damage tagging, luxury backgrounds, and custom viewers with AI-Powered Car Transformation and Integration with major automotive platforms.
Designed to boost engagement, enhance online presence, and streamline sales processes, HelloRamp.ai helps dealerships attract more leads, close deals faster, and optimize operations effectively.
HelloRamp is part of HelloAR, a one-stop AR/VR solutions company founded in 2017.
OneSpider Technologies LLP is a leading provider of software and mobile application solutions for the Pharma and FMCG sector. Our products help distributors and retailers streamline operations.
Automate Accounts is a technology-driven company dedicated to building intelligent automation solutions that streamline business operations and boost efficiency. We leverage modern platforms and tools to help businesses transform their workflows with cutting-edge solutions.
VDart - We are a global, emerging technology staffing solutions provider with expertise in SMAC (Social, Mobile, Analytics & Cloud), Enterprise Resource Planning (Oracle Applications, SAP), Business Intelligence (Hyperion), and Infrastructure services. We work with leading system integrators in the private and public sectors. We have deep industry expertise and focus in the BFSI, Manufacturing, Energy & Utilities, Healthcare, and Technology sectors. Our scope, knowledge, industry expertise, and global footprint have enabled us to provide best-in-industry solutions. With our core focus on emerging technologies, we have provided global technology workforce solutions in the USA, Canada, Mexico, Brazil, UK, Australia, and India. We take pride in delivering specialized talent, superior performance, and seamless execution to meet the challenging business needs of customers worldwide.