

Experience: 10 years
Location: Mumbai (ready to work from office)
Employment type: Permanent
Role: Business Analytics/Data Analytics Lead
- Enable business transformation with data analytics for major clients in sectors such as Oil & Gas, Manufacturing, Retail, and BFSI.
- Partner with business teams to understand the business area, help evolve and define the business problems and use cases, translate them into analytical problems, design the solution approach, define the ROI, and help execute to deliver insights that lead to action and business value.
- Connect business requirements with technology solutions, and determine feasibility, technical gaps, sizing, etc.
- Define a project plan for the identified solutions, help hire and train technical resources, and be the glue between the business team and the technical/implementation team.
- Solve ad hoc business problems, innovate on potential future solutions, and work with technology partners to build solutions and/or creative approaches to 'one of a kind' problems.
- Embed tested solutions to make them business as usual, and help quantify the value delivered by analytics.
- Be a creative and committed problem-solver, using data and advanced analytics skills alongside leadership and direct engagement with business leaders.
- Be a trusted advisor to the client's CxOs/leadership team in evangelising the use of data and data science in business decisions, and help accelerate the organisation's journey to becoming data-driven.
- Provide input during the development and implementation phases, including formulation and definition of system scope, objectives, and necessary system enhancements for complex, high-impact projects.
- Work with development and testing teams to deliver solutions and support UAT.
- Identify and communicate risks and impacts, and propose risk-mitigation options, considering the business implications of applying technology to the current business environment.
Qualifications
- 10+ years of experience in technology/project management/product management/solution delivery
- 4+ years of relevant experience in roles partnering on data analytics use cases and solutions that resulted in significant cost, productivity, and/or quality gains for the business
- Bachelor's degree in a relevant technical field (Computer Science, IT, Statistics, Business Analytics, or similar); MBA preferred
- Experience specifying and writing business requirements and suggesting appropriate solutions in the field of Data Analytics/AI
- Prior industry experience defining/creating data analytics solutions in one or more of the Oil & Gas, BFSI, Manufacturing, or Retail sectors
- Strong preference for experience in setting up and managing analytics/data products/platforms and leading change management
- Ability to articulate, and a keen eye for, the value that data analytics, AI, machine learning, and data lakes/catalogues can bring to organisations
- Ability to communicate complex quantitative insights in a precise and actionable manner
- Strong communication skills and the ability to work with peers and senior stakeholders, demonstrating strong vertical and lateral influence
- Working knowledge of analytics tools such as Power BI and Tableau, and of modelling techniques for regression, classification, and time-series problems
- Experience building insights from analytics and telling data stories
- Passion for empirical research and for answering hard questions with data
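As a flavour of the modelling techniques named above, here is a minimal sketch of ordinary least-squares regression in plain Python. The data and variable names are entirely hypothetical; a practitioner would normally reach for a library such as scikit-learn or statsmodels rather than hand-rolling this.

```python
# Minimal ordinary least-squares fit for y = a*x + b (illustrative only).
def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope = covariance(x, y) / variance(x)
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Hypothetical data: monthly ad spend vs. revenue (units are made up)
spend = [1.0, 2.0, 3.0, 4.0]
revenue = [3.1, 5.0, 6.9, 9.0]
slope, intercept = fit_line(spend, revenue)
```

The fitted slope then reads directly as "incremental revenue per unit of spend", which is the kind of quantified business insight this role is asked to deliver.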

About TSG Global Services Private Limited


AccioJob is conducting an offline hiring drive with Gaian Solutions India for the position of AI /ML Intern.
Required Skills - Python, SQL, and ML libraries such as scikit-learn, pandas, TensorFlow, etc.
Apply Here - https://go.acciojob.com/tUxTdV
Eligibility -
- Degree: B.Tech/BE/BCA/MCA/M.Tech
- Graduation Year: 2023, 2024, and 2025
- Branch: All Branches
- Work Location: Hyderabad
Compensation -
- Internship stipend: 20-25k
- Internship duration: 3 months
- CTC: 4.5-6 LPA
Evaluation Process -
- Assessment at the AccioJob Skill Centre in Pune
- 2 Technical Interviews
Apply Here - https://go.acciojob.com/tUxTdV
Important: Please bring your laptop & earphones for the test.

💯What will you do?
- Create and conduct engaging and informative Data Science classes that incorporate real-world examples and hands-on activities to ensure student engagement and retention.
- Evaluate student projects to ensure they meet industry standards and provide personalised, constructive feedback to students to help them improve their skills and understanding.
- Conduct viva sessions to assess student understanding and comprehension of the course materials. You will evaluate each student's ability to apply the concepts they have learned in real-world scenarios and provide feedback on their performance.
- Conduct regular assessments to evaluate student progress, provide feedback to students, and identify areas for improvement in the curriculum.
- Stay up-to-date with industry developments, best practices, and trends in Data Science, and incorporate this knowledge into course materials and instruction.
- Work with the placements team to provide guidance and support to students as they navigate their job search, including resume and cover letter reviews, mock interviews, and career coaching.
- Train the TAs to conduct doubt-clearing sessions and project evaluations.
💯Who are we looking for?
We are looking for someone who has:
- A minimum of 1-2 years of industry work experience in data science or a related field. Teaching experience is a plus.
- In-depth knowledge of various aspects of data science, such as Python, MySQL, Power BI, Excel, machine learning with statistics, NLP, and deep learning.
- Knowledge of AI tools like ChatGPT (latest versions as well), debugcode.ai, etc.
- Passion for teaching and a desire to impart practical knowledge to students.
- Excellent communication and interpersonal skills, with the ability to engage and motivate students of all levels.
- Experience with curriculum development, lesson planning, and instructional design is a plus.
- Familiarity with learning management systems (LMS) and digital teaching tools will be an added advantage.
- Ability to work independently and as part of a team in a fast-paced, dynamic environment.
💯What do we offer in return?
- Awesome colleagues & a great work environment - Internshala is known for its culture and has twice been recognised as a Great Place To Work in the last 3 years
- A massive learning opportunity to be an early member of a new initiative and experience building it from scratch
- Competitive remuneration
💰 Compensation - Competitive remuneration based on your experience and skills
📅 Start date - Immediately
About the Role:
We are looking for an experienced and imaginative Generative AI Architect & Engineer to lead, design, and build cutting-edge solutions using Large Language Models (LLMs) such as OpenAI’s GPT-4/5, Google Gemini, Meta’s LLaMA, and Anthropic Claude. You will spearhead the development of a corporate-wide GenAI platform that harnesses proprietary enterprise knowledge and empowers sales, operations, and other departments with instant, contextual insights, guidance, and automation.
As the GenAI leader, you will be responsible for building scalable pipelines, integrating enterprise content repositories, fine-tuning or adapting foundational models, and creating secure and intuitive access patterns for business teams. You are passionate about creating highly usable solutions that solve real-world problems, and are fluent across architecture, implementation, and MLOps best practices.
Key Responsibilities:
- Architect and implement an end-to-end GenAI solution leveraging LLMs to serve as a contextual assistant across multiple business units.
- Develop pipelines to ingest, clean, and index enterprise knowledge (documents, wikis, CRM, chat transcripts, etc.) using RAG (Retrieval-Augmented Generation) patterns and vector databases.
- Lead fine-tuning, prompt engineering, and evaluation of LLMs, adapting open-source or commercial models to enterprise needs.
- Design a secure, scalable, API-first microservice platform, including middleware and access control, integrated into corporate systems.
- Work closely with sales, operations, and customer support teams to gather use cases and translate them into impactful GenAI features.
- Drive experimentation and benchmarking to evaluate various open and closed LLMs (OpenAI, Claude, Gemini, LLaMA, Mistral, etc.) for best performance and cost-efficiency.
- Collaborate with DevOps teams to enable MLOps workflows, CI/CD pipelines, versioning, and A/B testing for AI models.
- Contribute to technical documentation, best practices, and internal knowledge sharing.
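The RAG pattern mentioned above can be sketched in a few lines: retrieve the knowledge-base chunk most similar to the query, then assemble it into a grounded prompt for the LLM. This toy version uses bag-of-words cosine similarity with stdlib only; a production system would use learned embeddings and a vector database (FAISS, Pinecone, etc.), and the chunk texts here are invented for illustration.

```python
import math
from collections import Counter

# Toy retrieval step of a RAG pipeline: rank knowledge-base chunks by
# cosine similarity of bag-of-words vectors, then build a grounded prompt.
def vectorize(text):
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query, chunks, k=1):
    qv = vectorize(query)
    ranked = sorted(chunks, key=lambda c: cosine(qv, vectorize(c)), reverse=True)
    return ranked[:k]

# Hypothetical enterprise knowledge-base chunks
chunks = [
    "Refunds are processed within 5 business days of approval.",
    "The sales dashboard refreshes every hour from the CRM.",
]
question = "how often does the sales dashboard refresh"
context = retrieve(question, chunks)[0]
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
```

The same retrieve-then-prompt shape scales up directly: swap `vectorize` for an embedding model and `retrieve` for a vector-store query, and the rest of the pipeline is unchanged.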
Key Qualifications:
- 4–5+ years of hands-on experience in AI/ML product development or applied research.
- Demonstrated experience working with LLMs (OpenAI, LLaMA, Claude, Gemini, Mistral, etc.) and RAG pipelines, including vector search (FAISS, Weaviate, Pinecone, Chroma, etc.).
- Strong Python skills and experience with frameworks such as LangChain, LlamaIndex, Transformers, Ray, HuggingFace, or equivalent.
- Deep understanding of NLP, model fine-tuning, embeddings, tokenization, and content ingestion pipelines.
- Exposure to enterprise content systems (e.g., SharePoint, Confluence, Salesforce, internal wikis, etc.) and integrating with them securely.
- Solid foundation in software architecture, microservices, API design, and cloud deployments (Azure, AWS, or GCP).
- Experience with security, RBAC, and compliance practices in enterprise-grade solutions.
- Ability to lead projects independently and mentor junior engineers or data scientists.
How to Apply:
Submit your resume and a short technical project summary or portfolio (GitHub, Hugging Face, blog posts)
● Strong in data structures and algorithms
● Experience in Node.js, Express, API Design & DOM
● Understanding of component based design or other design patterns
● Experience with unit testing, integration testing & continuous integration
● RDBMS and NoSQL databases, preferably PostgreSQL and MongoDB
● Good to have passion for investing
Position Responsibilities
● Be honest, reliable & consistent
● Write efficient & clean code
● Have a strong sense of ownership
● Be a part of the development & maintenance of our company web app, Operations dashboard, and other 3rd-party products we own
● Contribute to improving the quality of engineering process & engineering culture
Job Description:
- Handle customer queries and outbound/inbound calls with a team.
- Convert leads into prospects and pass them on to Sales.
- Update leads daily on the CRM.
- Provide an assisted buying experience at the site.
- Energetic, self-motivated, and able to work independently under pressure.
- Fluent communication in English, Marathi, and Hindi.
- 3+ years of relevant experience
- 2+ years of experience with AWS (EC2, ECS, RDS, ElastiCache, etc.)
- Well versed in maintaining infrastructure as code (Terraform, CloudFormation, etc.)
- Experience setting up CI/CD pipelines from scratch
- Knowledge of setting up and securing networks (VPN, intranet, VPC, peering, etc.)
- Understanding of common security issues
Job Profile
Purpose of Role: To support and maintain the existing and future applications managed by ABD (Agile Business Development) Team.
Main Responsibilities:
- To work on existing and future applications as required
- Work with ABD India Application Support and Business Support team to address production issues and/or to do development work
- Work with US ABD team to coordinate the on-going support and monthly release of C-Hub (Customer-Hub) application and other ABD Applications
- Report status to management on project progress
- Identify and escalate related issues and risks
- Coordinate with others in the Development and Business Support teams to implement new features using SQL Server / MS web-based technologies
- Assess current modules and suggest performance-improvement techniques
- Make sure the database design standards/coding standards established by ABD are followed
- Maintain constant communication with the ABD leadership team in the US
- Problem-solving ability with a "can-do" attitude
- IT experience in the banking industry is preferred
Candidate Profile:
- Minimum 10 years of experience, with at least 3-5 years in the Corporate & Investment Banking area, and a Bachelor's degree in Engineering, Accounting, Finance, or equivalent
- Expertise in MS SQL Server development, including SSIS and SSRS; proficient in developing stored procedures, views, functions, etc. in MS SQL Server
- Experienced with DBA-type tasks such as replication, linked servers, Query Analyzer, DBCC commands, and data transfer from one DB server to another
- Experienced in MS Excel VBA and macro scripting
- Knowledge of data encryption in databases

* Work with the product managers and designers and co-own our consumer app (1M+ userbase)
* Own our admin dashboard for all the product offerings - digital gold, gold loan, & gold locker
* Architect, design, and maintain frontend libraries for both our consumer application and admin dashboard
* Mentor team of 3 - 4 frontend developers to build a robust, lightweight, and high-performance client-side app
* Translate designs and wireframes into high-quality TypeScript code
* Write documentation and guides for consumer app & admin dashboard
Key Qualifications
* Expertise in ReactJS/Redux
* Expert-level knowledge in TypeScript or Flow
* 3-5+ years of pure frontend development experience
* Expert-level knowledge of developing, shipping, and maintaining JavaScript applications
* Knowledge of general software design patterns
* Good understanding of CSR and SSR
* Deep understanding of JavaScript
* Up to date on the latest build tools and libraries, such as ES6, Webpack, and Babel
* Proficient in JavaScript with strong object-oriented design skills
* Able to work independently and drive results
Bonus
* Previous work experience in product-based (B2B/B2C) / fintech startups
* Contributed to or maintained an open-source library

WHAT YOU WILL DO:
● Create and maintain optimal data pipeline architecture.
● Assemble large, complex data sets that meet functional/non-functional business requirements.
● Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
● Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using Spark, Hadoop, and AWS 'big data' technologies (EC2, EMR, S3, Athena).
● Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
● Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs.
● Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
● Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
● Work with data and analytics experts to strive for greater functionality in our data systems.
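The extraction-transformation-loading responsibilities above follow a standard pattern, sketched here at toy scale with the standard library only. The posting's real stack is Spark/Hadoop on AWS; this example just makes the E-T-L stages concrete, and all data, column names, and the `spend` table are invented for illustration.

```python
import csv
import io
import sqlite3

# Extract: parse raw CSV records from a source (here, an in-memory string).
raw = "user_id,amount\n1,10.5\n2,not_a_number\n1,4.5\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: validate and cast types, dropping malformed records.
clean = []
for r in rows:
    try:
        clean.append((int(r["user_id"]), float(r["amount"])))
    except ValueError:
        continue  # skip rows that fail validation

# Load: write into a warehouse table, then aggregate per user.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE spend (user_id INTEGER, amount REAL)")
db.executemany("INSERT INTO spend VALUES (?, ?)", clean)
totals = dict(db.execute("SELECT user_id, SUM(amount) FROM spend GROUP BY user_id"))
```

At production scale the same three stages map onto S3 (extract), Spark on EMR (transform), and a warehouse queried via Athena or similar (load), with the validation step becoming explicit data-quality checks.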
REQUIRED SKILLS & QUALIFICATIONS:
● 5+ years of experience in a Data Engineer role.
● Advanced working SQL knowledge and experience working with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases.
● Experience building and optimizing 'big data' data pipelines, architectures, and data sets.
● Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
● Strong analytic skills related to working with unstructured datasets.
● Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management.
● A successful history of manipulating, processing, and extracting value from large disconnected datasets.
● Working knowledge of message queuing, stream processing, and highly scalable 'big data' data stores.
● Strong project management and organizational skills.
● Experience supporting and working with cross-functional teams in a dynamic environment.
● Experience with big data tools: Hadoop, Spark, Pig, Vertica, etc.
● Experience with AWS cloud services: EC2, EMR, S3, Athena.
● Experience with Linux.
● Experience with object-oriented/object function scripting languages: Python, Java, Shell, Scala, etc.
PREFERRED SKILLS & QUALIFICATIONS:
● Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field.


