11+ XGBoost Jobs in Mumbai | XGBoost Job openings in Mumbai
Apply to 11+ XGBoost Jobs in Mumbai on CutShort.io. Explore the latest XGBoost Job opportunities across top companies like Google, Amazon & Adobe.
What You’ll Do
● Partner with Product to spot high-leverage ML opportunities tied to business metrics.
● Wrangle large structured and unstructured datasets; build reliable features and data contracts.
● Build and ship models to:
○ Enhance customer experiences and personalization
○ Boost revenue via pricing/discount optimization
○ Power user-to-user discovery and ranking (matchmaking at scale)
○ Detect and block fraud/risk in real time
○ Score conversion/churn/acceptance propensity for targeted actions
● Collaborate with Engineering to productionize via APIs, CI/CD, and Docker on AWS.
● Design and run A/B tests with guardrails.
● Build monitoring for model/data drift and business KPIs.
What We’re Looking For
● 2–4 years of DS/ML experience in consumer internet / B2C products, with 7–8 models shipped to production end-to-end.
● Proven, hands-on success in at least two (preferably 3–4) of the following:
○ Recommender systems (retrieval + ranking, NDCG/Recall, online lift; bandits a plus)
○ Fraud/risk detection (severe class imbalance, PR-AUC)
○ Pricing models (elasticity, demand curves, margin vs. win-rate trade-offs, guardrails/simulation)
○ Propensity models (payment/churn)
● Programming: strong Python and SQL; solid Git, Docker, CI/CD.
● Cloud and data: experience with AWS or GCP; familiarity with warehouses/dashboards (Redshift/BigQuery, Looker/Tableau).
● ML breadth: recommender systems, NLP or user profiling, anomaly detection.
● Communication: clear storytelling with data; can align stakeholders and drive decisions.
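On the fraud/risk bullet above: under severe class imbalance (say 1% positives), plain accuracy is uninformative, which is why the listing calls out PR-AUC. A minimal pure-Python sketch of the precision/recall computation that PR-AUC aggregates; the scores and labels are toy values for illustration, not output of any real model:

```python
def precision_recall(scores, labels, threshold):
    """Precision and recall for binary labels at a given score threshold."""
    tp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 1)
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
    fn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 1)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# Toy imbalanced data: only 2 positives among 10 examples (illustrative).
scores = [0.9, 0.8, 0.3, 0.2, 0.2, 0.1, 0.1, 0.1, 0.05, 0.05]
labels = [1, 0, 1, 0, 0, 0, 0, 0, 0, 0]

p, r = precision_recall(scores, labels, 0.5)
# At threshold 0.5 the model flags 2 examples, 1 of them truly positive:
# precision = 0.5, recall = 0.5. Sweeping the threshold and integrating
# the resulting precision-recall curve yields PR-AUC.
```

In practice you would use a library routine (e.g. scikit-learn's `precision_recall_curve`) rather than hand-rolling this, but the definition is what makes PR-AUC robust where accuracy is not.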
Review Criteria:
- Strong Dremio / Lakehouse Data Architect profile
- 5+ years of experience in Data Architecture / Data Engineering, with at least 3 years hands-on in Dremio
- Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
- Deep experience with cloud object storage (S3/ADLS/GCS) and file formats such as Parquet, Delta, and Iceberg, along with distributed query planning concepts
- Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
- Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
- Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
- Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
- Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies
Preferred:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) or data catalogs (Collibra, Alation, Purview); familiarity with Snowflake, Databricks, or BigQuery environments
Role & Responsibilities:
You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
Ideal Candidate:
- Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
- 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
Preferred:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
- Exposure to Snowflake, Databricks, or BigQuery environments.
- Experience in high-tech, manufacturing, or enterprise data modernization programs.
Job Description
Role: Software Engineer/Senior Software Engineer/Lead Engineer (Java)
Required Skill Set:
- Experience in Core Java 5.0 and above, Data Structures, OOPS, Multithreading, Algorithms, Collections, Unix/Linux
- Possess good architectural knowledge and be aware of enterprise application design patterns.
- Should have the ability to analyse, design, develop and test complex, low-latency client-facing applications.
- Good development experience with RDBMS
- Good knowledge of multi-threading and high-volume server-side development
- Basic working knowledge of Unix/Linux
- Excellent problem solving and coding skills in Java
- Strong interpersonal, communication and analytical skills.
- Should have the ability to express their design ideas and thoughts.
Job Brief:
· Understand product requirements and come up with solution approaches
· Build and enhance large scale domain centric applications
· Deploy high quality deliverables into production adhering to the security, compliance and SDLC guidelines
About the company
KPMG International Limited, commonly known as KPMG, is one of the largest professional services networks in the world, recognized as one of the "Big Four" accounting firms alongside Deloitte, PricewaterhouseCoopers (PwC), and Ernst & Young (EY). KPMG provides a comprehensive range of professional services primarily focused on three core areas: Audit and Assurance, Tax Services, and Advisory Services. Their Audit and Assurance services include financial statement audits, regulatory audits, and other assurance services. The Tax Services cover various aspects such as corporate tax, indirect tax, international tax, and transfer pricing. Meanwhile, their Advisory Services encompass management consulting, risk consulting, deal advisory, and other related services.
Form link for quicker response: https://forms.gle/HdQPqyWCirDUEgMv5
Job Description
Positions: Chief Compliance Officer (CCO)/ Money Laundering Reporting Officer (MLRO)
Education Qualification:
- Degree in finance, accounting, business administration, economics, law, or criminology
- Certified Anti-Money Laundering Specialist (CAMS), Certified Fraud Examiner (CFE), or Certified Compliance Officer (CCO)
Experience: 12-14 years
Location: Pan India, with a potential requirement to travel to the Middle East
Employment Type: Contract for 6-12 months (Hybrid)
Responsibilities:
Regulatory Compliance:
- Ensure adherence to QFCRA regulations and other applicable laws; keep policies updated with regulatory changes.
- Compliance Program: Design and maintain the firm’s compliance program, covering operational, legal, and risk requirements.
- Risk Management: Conduct risk assessments and develop strategies to mitigate compliance risks.
- Training: Provide ongoing compliance training to staff and senior management.
- Reporting & Auditing: Conduct internal audits and report findings to management and regulators.
- Liaison: Act as the main contact with QFCRA and other regulatory bodies.
- Incident Management: Investigate compliance breaches and take corrective actions.
- Cross-Border Support: Provide support for cross-border activities, including jurisdictions where marketing materials and funds are distributed.
Money Laundering:
- AML Program: Implement and manage the firm’s Anti-Money Laundering and Anti-Terrorist Financing program.
- KYC & Monitoring: Oversee KYC and ongoing transaction monitoring, ensuring compliance with AML rules.
- Suspicious Activity Reporting: Identify and report suspicious transactions.
- Staff Training: Ensure regular AML/CTF training for employees.
- Regulatory Liaison: Communicate with QFCRA on all AML/CTF-related matters.
- Audit & Testing: Conduct regular AML compliance audits and implement corrective measures where necessary.
Aadrila Technologies Pvt Ltd. is the largest data, analytics, automation, and decisioning solution provider to financial institutions, catering to the entire customer lifecycle from onboarding to diligence. Aadrila Technologies provides solutions that enable systemic fraud prevention, risk management, compliance, and automation through superior data engineering and deep-tech (NLP and deep learning) applications. In the B2B SaaS market, Aadrila Technologies Pvt. Ltd. is an undisputed leader. Based in Andheri, Mumbai, it has ~90% market share in the motor insurance underwriting data providers market.
Senior Software Engineer (4-6 years of work experience)
As a Senior Software Engineer at AADRILA TECHNOLOGIES PVT LTD, you will be responsible for designing, developing, and deploying high-quality API solutions. You will collaborate with cross-functional teams to build innovative and performant applications.
Responsibilities:
- Write reusable, testable, and efficient code.
- Design and implement low-latency, high-availability, and performant RESTful APIs.
- Implement security and data protection measures.
- Integrate data storage solutions into software systems.
- Develop, test, tune for performance, and deploy web services.
- Collaborate with the product team to build innovative, robust, and user-friendly features.
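To make the "low-latency, high-availability RESTful APIs" responsibility concrete, here is a minimal stdlib-only sketch of a JSON-over-HTTP endpoint. The `/health` route and its payload are illustrative choices, not taken from the job description; a real service would use a framework such as FastAPI or Flask behind a production server:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class ApiHandler(BaseHTTPRequestHandler):
    """Serves a single illustrative GET /health endpoint returning JSON."""

    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, fmt, *args):
        pass  # keep demo output quiet

# Port 0 asks the OS for any free port; serve in a background thread.
server = HTTPServer(("127.0.0.1", 0), ApiHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/health"
with urllib.request.urlopen(url) as resp:
    payload = json.loads(resp.read())
server.shutdown()
```

The same request/response shape (route, status code, JSON body, content headers) carries over directly to framework-based implementations.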
Requirements:
Must-Haves:
- Excellent analytical and problem-solving skills.
- Proven experience in the software development lifecycle.
- Solid coding experience in Python.
- Good understanding of Object-Oriented Concepts and Design Patterns.
- Knowledge of Amazon Web Services (AWS) APIs deployment and management.
- Experience working with Multithreading/Multiprocessing.
- Good working knowledge of programming in a Linux environment.
- Acquaintance with Web Stacks and RESTful APIs.
- Experience in best code deployment practices.
Highly Desired:
- Experience with NoSQL databases like MongoDB.
- Familiarity with the AWS Serverless Stack (API Gateway, Lambda, SQS, CloudWatch, etc.)
- Familiarity with infrastructure-as-code tools (e.g., the Serverless Framework and Terraform).
Experience and Qualification:
- At least 3 years of relevant experience.
- Bachelor's or Master's degree in Computer Science, Computer Engineering, or Information Technology.
Join our dynamic team and work in a challenging environment where you will have the opportunity to make a significant impact. AADRILA TECHNOLOGIES PVT LTD offers competitive compensation, professional development opportunities, and a collaborative work culture. Apply now and be part of our innovative software engineering team.
Engineering Manager/Technical Architect (10+ years)
Skill Set: 10+ years as a full-stack Java/JavaScript developer
Microservices, Distributed Systems
Cloud Services: AWS (EC2, S3, Lambda, Load Balancing, Serverless)
Programming, Backend: Node.js, Spring Boot, Java
Programming, Frontend: React.js, Angular
Queuing: RabbitMQ/Kafka
Methodologies: Agile Scrum
Responsibilities:
End-to-end coding, from software architecture to managing scaling, of high-throughput (100,000 RPS) high-volume transactions
Discuss business requirements and timelines with management and create a task plan for junior members
Manage the day-to-day activities of all team members and report their work progress
Mentor the team on best coding practices and ensure modules go live on time
Manage security vulnerabilities
Be a full individual contributor, meaning you can work in a team as well as alone
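As rough, back-of-envelope context for what a 100,000 RPS target implies: the latency and per-instance throughput figures below are assumptions chosen for illustration, not numbers from the posting.

```python
# Capacity sketch for a 100,000 RPS target (assumed figures marked below).
target_rps = 100_000        # throughput goal stated in the posting
avg_latency_s = 0.005       # assumed 5 ms average service time (illustrative)

# Little's law: concurrency = arrival rate * time spent in the system.
concurrent_requests = target_rps * avg_latency_s    # requests in flight

# If one instance sustains an assumed 200 RPS, instances needed is the
# ceiling of target / per-instance throughput.
per_instance_rps = 200
instances = -(-target_rps // per_instance_rps)      # ceiling division
```

At these assumed numbers the system must hold about 500 requests in flight at any moment, which is why the role pairs architecture with scaling management.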
Attitude:
Passion for tech innovation and problem solving
Go-getter attitude
Extremely humble and polite
Experience in product companies and managing small teams is a plus
Who Are We?
Billion Bonds Finance
We are a new-age finance company developing the next-generation digital-only bank focused on Gen Z! Would you like to lead the core team changing the boring user experience in banking?
Hello,
As discussed, below is the JD for GC/BDE Intern/BDE/Sr. BDE/BDM - Inside Sales.
Company Name: uFaber Edutech Pvt Ltd.
Shift Timing/Days: 10:30 am-7:30 pm, Mon-Sat.
Website: www.ufaber.com
Who are we?
uFaber is a well-funded Edutech startup, founded by serial entrepreneurs from IIT Bombay to change the way we learn. We sell high-quality online courses on a variety of topics, from exam preparation to certifications.
Key Responsibilities
- Write clean, testable code using .NET programming languages
- Test and debug various .NET applications
- Deploy fully functional applications
- Upgrade existing applications
- Document development and operational procedures
Technical Competencies
- Understanding of object-oriented programming concepts, design patterns and SOLID principles
- Must have hands-on experience with C#, ASP.NET, MVC, Windows Services, .NET Entity Framework, SQL Server/T-SQL, IIS and related technologies
- Understanding of REST APIs and WCF
- Understanding of modern techniques such as Responsive Web Design, LINQ, Dependency Injection and Design Patterns
- Must have hands-on experience with web client technologies, including HTML5, CSS3, JavaScript, AJAX, JSON and jQuery
- Knowledge of various industry-standard development practices (Agile methodology, OOD, TDD, BDD)
- Must have hands-on experience with tools that enable the SDLC (TFS, Visual Studio, etc.)
If interested, please share your profile via the link below:
https://smrtr.io/bqCWb





