
Infinium Associate
https://infiniumassociates.com
We are looking for a visionary and hands-on Head of Data Science and AI with at least 6 years of experience to lead our data strategy and analytics initiatives. In this pivotal role, you will take full ownership of the end-to-end technology stack, driving a data-analytics-driven business roadmap that delivers tangible ROI. You will not only guide high-level strategy but also remain hands-on in model design and deployment, ensuring our data capabilities directly empower executive decision-making.
If you are passionate about leveraging AI and Data to transform financial services, we invite you to lead our data transformation journey.
Key Responsibilities
Strategic Leadership & Roadmap
- End-to-End Tech Stack Ownership: Define, own, and evolve the complete data science and analytics technology stack to ensure scalability and performance.
- Business Roadmap & ROI: Develop and execute a data analytics-driven business roadmap, ensuring every initiative is aligned with organizational goals and delivers measurable Return on Investment (ROI).
- Executive Decision Support: Create and present high-impact executive decision packs, providing actionable insights that drive key business strategies.
Model Design & Deployment (Hands-on)
- Hands-on Development: Lead by example with hands-on involvement in AI modeling, machine learning model design, and algorithm development using Python.
- Deployment & Ops: Oversee and execute the deployment of models into production environments, ensuring reliability, scalability, and seamless integration with existing systems.
- Leverage expert-level knowledge of Google Cloud Agentic AI, Vertex AI and BigQuery to build advanced predictive models and data pipelines.
- Develop business dashboards for various sales channels and drive data-driven decision-making to improve sales and reduce costs.
Governance & Quality
- Data Governance: Establish and enforce robust data governance frameworks, ensuring data accuracy, security, consistency, and compliance across the organization.
- Best Practices: Champion best practices in coding, testing, and documentation to build a world-class data engineering culture.
Collaboration & Innovation
- Work closely with Product, Engineering, and Business leadership to identify opportunities for AI/ML intervention.
- Stay ahead of industry trends in AI, Generative AI, and financial modeling to keep Bajaj Capital at the forefront of innovation.
Must-Have Skills & Experience
Experience:
- At least 7 years of industry experience in Data Science, Machine Learning, or a related field.
- Proven track record of applying AI and leading data science teams or initiatives that resulted in significant business impact.
Technical Proficiency:
- Core Languages: Proficiency in Python is mandatory, with strong capabilities in libraries such as Pandas, NumPy, Scikit-learn, TensorFlow/PyTorch.
- Cloud Data Stack: Expert-level command of Google Cloud Platform (GCP), specifically Agentic AI, Vertex AI and BigQuery.
- AI & Analytics Stack: Deep understanding of the modern AI and Data Analytics stack, including data warehousing, ETL/ELT pipelines, and MLOps.
- Visualization: Power BI in combination with custom web/mobile applications.
Leadership & Soft Skills:
- Ability to translate complex technical concepts into clear business value for stakeholders.
- Strong ownership mindset with the ability to manage end-to-end project lifecycles.
- Experience in creating governance structures and executive-level reporting.
Good-to-Have / Plus
- Domain Expertise: Prior experience in the BFSI domain (Wealth Management, Insurance, Mutual Funds, or Fintech).
- Certifications: Google Professional Data Engineer or Google Professional Machine Learning Engineer certifications.
- Advanced AI: Experience with Generative AI (LLMs), RAG architectures, and real-time analytics.

Job Role: Teamcenter Admin
• Teamcenter and CAD (NX) Configuration Management
• Advanced debugging and root-cause analysis beyond L2
• Code fixes and minor defect remediation
• AWS knowledge, which is foundational to our Teamcenter architecture
• Experience supporting weekend and holiday code deployments
• Operational administration (break/fix, ticket escalation handling, problem management)
• Support for project activities
• Deployment and code release support
• Hypercare support following deployment, which is expected to onboard more than 1,000 additional users
The recruiter has not been active on this job recently. You may apply but please expect a delayed response.
DataHavn IT Solutions is a company that specializes in big data and cloud computing, artificial intelligence and machine learning, application development, and consulting services. We aim to be a frontrunner in everything to do with data, and we have the expertise to transform customer businesses by making the right use of it.
About the Role
We're seeking a talented and versatile Full Stack Developer with a strong foundation in mobile app development to join our dynamic team. You'll play a pivotal role in designing, developing, and maintaining high-quality software applications across various platforms.
Responsibilities
- Full Stack Development: Design, develop, and implement both front-end and back-end components of web applications using modern technologies and frameworks.
- Mobile App Development: Develop native mobile applications for iOS and Android platforms using Swift and Kotlin, respectively.
- Cross-Platform Development: Explore and utilize cross-platform frameworks (e.g., React Native, Flutter) for efficient mobile app development.
- API Development: Create and maintain RESTful APIs for integration with front-end and mobile applications.
- Database Management: Work with databases (e.g., MySQL, PostgreSQL) to store and retrieve application data.
- Code Quality: Adhere to coding standards, best practices, and ensure code quality through regular code reviews.
- Collaboration: Collaborate effectively with designers, project managers, and other team members to deliver high-quality solutions.
Qualifications
- Bachelor's degree in Computer Science, Software Engineering, or a related field.
- Strong programming skills in relevant languages such as JavaScript, Python, or Java.
- Experience with relevant frameworks and technologies such as React, Angular, Node.js, Swift, or Kotlin.
- Understanding of software development methodologies (e.g., Agile, Waterfall).
- Excellent problem-solving and analytical skills.
- Ability to work independently and as part of a team.
- Strong communication and interpersonal skills.
Preferred Skills (Optional)
- Experience with cloud platforms (e.g., AWS, Azure, GCP).
- Knowledge of DevOps practices and tools.
- Experience with serverless architectures.
- Contributions to open-source projects.
What We Offer
- Competitive salary and benefits package.
- Opportunities for professional growth and development.
- A collaborative and supportive work environment.
- A chance to work on cutting-edge projects.
About the Role:
As a Data Scientist specializing in Google Cloud, you will play a pivotal role in driving data-driven decision-making and innovation within our organization. You will leverage the power of Google Cloud's robust data analytics and machine learning tools to extract valuable insights from large datasets, develop predictive models, and optimize business processes.
Key Responsibilities:
- Data Ingestion and Preparation:
- Design and implement efficient data pipelines for ingesting, cleaning, and transforming data from various sources (e.g., databases, APIs, cloud storage) into Google Cloud Platform (GCP) data warehouses (BigQuery) or data lakes, using tools such as Dataflow.
- Perform data quality assessments, handle missing values, and address inconsistencies to ensure data integrity.
- Exploratory Data Analysis (EDA):
- Conduct in-depth EDA to uncover patterns, trends, and anomalies within the data.
- Use visualization tools (e.g., Tableau, Looker) to communicate findings effectively.
- Feature Engineering:
- Create relevant features from raw data to enhance model performance and interpretability.
- Explore techniques like feature selection, normalization, and dimensionality reduction.
- Model Development and Training:
- Develop and train predictive models using machine learning algorithms (e.g., linear regression, logistic regression, decision trees, random forests, neural networks) on GCP platforms like Vertex AI.
- Evaluate model performance using appropriate metrics and iterate on the modeling process.
- Model Deployment and Monitoring:
- Deploy trained models into production environments using GCP's ML tools and infrastructure.
- Monitor model performance over time, identify drift, and retrain models as needed.
- Collaboration and Communication:
- Work closely with data engineers, analysts, and business stakeholders to understand their requirements and translate them into data-driven solutions.
- Communicate findings and insights in a clear and concise manner, using visualizations and storytelling techniques.
Required Skills and Qualifications:
- Strong proficiency in Python or R programming languages.
- Experience with Google Cloud Platform (GCP) services such as BigQuery, Dataflow, Cloud Dataproc, and Vertex AI.
- Familiarity with machine learning algorithms and techniques.
- Knowledge of data visualization tools (e.g., Tableau, Looker).
- Excellent problem-solving and analytical skills.
- Ability to work independently and as part of a team.
- Strong communication and interpersonal skills.
Preferred Qualifications:
- Experience with cloud-native data technologies (e.g., Apache Spark, Kubernetes).
- Knowledge of distributed systems and scalable data architectures.
- Experience with natural language processing (NLP) or computer vision applications.
- Certifications in Google Cloud Platform or relevant machine learning frameworks.
About the Role:
We are seeking a talented Data Engineer to join our team and play a pivotal role in transforming raw data into valuable insights. As a Data Engineer, you will design, develop, and maintain robust data pipelines and infrastructure to support our organization's analytics and decision-making processes.
Responsibilities:
- Data Pipeline Development: Build and maintain scalable data pipelines to extract, transform, and load (ETL) data from various sources (e.g., databases, APIs, files) into data warehouses or data lakes.
- Data Infrastructure: Design, implement, and manage data infrastructure components, including data warehouses, data lakes, and data marts.
- Data Quality: Ensure data quality by implementing data validation, cleansing, and standardization processes.
- Team Management: Able to lead and manage a team.
- Performance Optimization: Optimize data pipelines and infrastructure for performance and efficiency.
- Collaboration: Collaborate with data analysts, scientists, and business stakeholders to understand their data needs and translate them into technical requirements.
- Tool and Technology Selection: Evaluate and select appropriate data engineering tools and technologies (e.g., SQL, Python, Spark, Hadoop, cloud platforms).
- Documentation: Create and maintain clear and comprehensive documentation for data pipelines, infrastructure, and processes.
Skills:
- Strong proficiency in SQL and at least one programming language (e.g., Python, Java).
- Experience with data warehousing and data lake technologies (e.g., Snowflake, AWS Redshift, Databricks).
- Knowledge of cloud platforms (e.g., AWS, GCP, Azure) and cloud-based data services.
- Understanding of data modeling and data architecture concepts.
- Experience with ETL/ELT tools and frameworks.
- Excellent problem-solving and analytical skills.
- Ability to work independently and as part of a team.
Preferred Qualifications:
- Experience with real-time data processing and streaming technologies (e.g., Kafka, Flink).
- Knowledge of machine learning and artificial intelligence concepts.
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Certification in cloud platforms or data engineering.
We are seeking a skilled Magic xpa Developer with strong proficiency in .NET technologies to design, develop, and maintain enterprise applications. The ideal candidate will have hands-on experience with Magic xpa (formerly uniPaaS / eDeveloper), integration with .NET components, and solid knowledge of database systems and web services.
Key Responsibilities
- Develop, enhance, and maintain business applications using Magic xpa Application Platform.
- Integrate Magic xpa applications with .NET modules, APIs, and external systems.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications.
- Design and implement data integration, business logic, and UI functionalities within Magic xpa and .NET frameworks.
- Debug, troubleshoot, and optimize application performance.
- Participate in code reviews, unit testing, and deployment activities.
- Work with SQL Server or other RDBMS for data modeling, stored procedures, and performance tuning.
- Ensure adherence to coding standards, security practices, and documentation requirements.
Required Skills & Experience
- 5+ years of experience in Magic xpa (version 3.x or 4.x) development.
- Strong proficiency in C#, .NET Framework / .NET Core, and Visual Studio.
- Experience integrating Magic xpa applications with REST/SOAP APIs.
- Hands-on experience with SQL Server / Oracle, including complex queries and stored procedures.
- Good understanding of software development lifecycle (SDLC), agile methodologies, and version control tools (Git, TFS).
- Ability to troubleshoot runtime and build-time issues in Magic xpa and .NET environments.
- Excellent analytical, problem-solving, and communication skills.
Nice to Have
- Experience with Magic xpi (integration platform).
- Knowledge of web technologies (ASP.NET, HTML5, JavaScript).
- Exposure to cloud environments (AWS / Azure).
- Understanding of enterprise system integration and microservices architecture.
Similar companies
About the company
Fractal is one of the most prominent players in the Artificial Intelligence space. Fractal's mission is to power every human decision in the enterprise, bringing AI, engineering, and design to the world's most admired Fortune 500® companies.
Fractal's products include Qure.ai to assist radiologists in making better diagnostic decisions, Crux Intelligence to assist CEOs and senior executives in making better tactical and strategic decisions, Theremin.ai to improve investment decisions, Eugenie.ai to find anomalies in high-velocity data, Samya.ai to drive next-generation Enterprise Revenue Growth Management, and Senseforth.ai to automate customer interactions at scale to grow top line and bottom line. Analytics Vidhya, the largest Analytics and Data Science community, offers industry-focused training programs.
Fractal has more than 3,600 employees across 16 global locations, including the United States, the UK, Ukraine, India, Singapore, and Australia. Fractal has consistently been rated among India's best companies to work for by the Great Place to Work® Institute; featured as a leader in the Customer Analytics Service Providers Wave™ 2021, Computer Vision Consultancies Wave™ 2020, and Specialized Insights Service Providers Wave™ 2020 by Forrester Research; named a leader in the Analytics & AI Services Specialists Peak Matrix 2021 by Everest Group; and recognized as an "Honorable Vendor" in the 2022 Magic Quadrant™ for data & analytics by Gartner. For more information, visit fractal.ai
About the company
CAW Studios is a Product Development Studio. WE BUILD TRUE PRODUCT TEAMS for our clients. Each team is a small, well-balanced group of geeks and a product manager that together produce relevant, high-quality products. We use data to make decisions, bringing big data and analysis to software development. We believe the product development process is broken, as most studios operate like IT services firms. We operate like a software factory, applying manufacturing principles of product development to software.
About the company
Tarento is a unique blend of Nordic efficiency and Indian technological depth, operating from offices in Sweden, Finland, Norway, and India. With its largest technology hub in Bangalore and strong consulting and data management capabilities in Finland, the company helps organisations navigate digital transformation with no-hassle, high-quality services across enterprise applications, data & information management, and custom engineering solutions.
Founded in 2009 and strengthened through strategic milestones—including becoming part of the Acando Group and acquiring Finnish data management consultancy DATPRO—Tarento has built a comprehensive capability stack that bridges technology and business. Today, the company supports clients across the Nordics and India, solving core business challenges through a powerful combination of advanced technology skills, deep data expertise, and reliable long-term application management services.
About the company
Nudge is a user experience platform for consumer companies to help them activate, retain, and understand users.
We’re backed by Antler, and marquee angels including Kunal Shah (Cred), Dhruv Bahl (BharatPe), Bharati Balakrishnan (Shopify), Prashant Pitti (EaseMyTrip), Pallav Nadhani (FusionCharts), and Ajinkya Kulkarni (WintWealth).
Who We Are
We're on a mission to revolutionize how companies deliver personalized experiences to their users. At Nudge, we've built a powerful SaaS platform that enables consumer companies to run rapid product experiments and deliver truly personalized user experiences. What makes us special? We help product and growth teams build native UX components without needing any development effort – yes, you heard that right!
Our Secret Sauce
We're proud of our comprehensive suite of tools that work together seamlessly:
- Want to create meaningful interactions? Our dynamic UX library has got you covered
- Need to experiment with UI in real-time? Jump into our visual builder
- Looking to create personalized customer journeys? Our user flows are here to help
- Want to incentivize actions? Our reward engine makes it simple
- Plus, we've got powerful experimentation capabilities and analytics tools to track it all
The best part? We integrate with over 25 popular tools like Mixpanel, Amplitude, Segment, and Braze. We believe in making your existing tech stack work better, not replacing it!
Our Team
We're led by our passionate co-founders, Kanishka Thakur (CEO) and Gaurav Rawat. Under Kanishka's leadership, we're pushing the boundaries of what's possible in personalization and experimentation.
Our Growth Story
We're backed by some amazing investors who believe in our vision. Our journey started with initial backing from GradCapital, and we've since received support from Antler. We're excited to share that we recently closed our Pre-Seed round in 2024, which is helping us accelerate our mission even faster!
What Makes Us Different:
We understand that every user is unique, and their experience should be too. That's why we've built our platform to personalize across multiple dimensions - content, timing, incentives, and UI. Whether you're running A/B tests, creating personalized journeys, or analyzing user behavior, we make it simple and effective.
Want to see how we can transform your user experiences? We'd love to show you around our platform - check out our website or the demo below!
About the company
At Hunarstreet Technologies Pvt Ltd, we specialize in delivering India’s fastest hiring solutions, tailored to meet the unique needs of businesses across various industries. Our mission is to connect companies with exceptional talent, enabling them to achieve their growth and operational goals swiftly and efficiently.
We achieve an 87% success rate in candidate-to-position relevancy and a 62% success rate in closing the positions shared with us.
About the company
Aegion is building the future of workplace automation through intelligent AI agents. We’re creating a new category of workforce solutions that enables companies to hire AI agents to handle specific job functions, starting as interns and evolving into full contributors within organizations.
We develop specialized AI workforce agents that can be hired just like human employees. These agents integrate seamlessly into existing company workflows, have access to relevant company data and tools, and can perform 60-80% of the work typically done by human employees in specific roles.








