
Someone with prior experience in the Product, Marketing, Growth, or Consumer Analytics domain.
Ability to define success metrics for different initiatives or campaigns, drive ROI analysis, and provide recommendations.
Strong analytical acumen and problem-solving skills.
Good understanding of clickstream/app-event data; able to build and interpret user journeys across platforms.
Understanding of LTV modelling.
Strong SQL skills; comfortable with Python programming.
Basic knowledge of different regression models and clustering techniques.
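As an illustration of the clickstream skills above, here is a minimal sessionization sketch in Python; the 30-minute inactivity threshold and the (user_id, timestamp) event shape are assumptions for the example, not part of the original posting:

```python
from datetime import datetime, timedelta

SESSION_GAP = timedelta(minutes=30)  # assumed inactivity threshold

def sessionize(events):
    """Group (user_id, timestamp) events into sessions per user.

    Consecutive events more than SESSION_GAP apart start a new session.
    Returns {user_id: [[timestamps of session 1], [session 2], ...]}.
    """
    sessions = {}
    for user_id, ts in sorted(events, key=lambda e: (e[0], e[1])):
        user_sessions = sessions.setdefault(user_id, [])
        if user_sessions and ts - user_sessions[-1][-1] <= SESSION_GAP:
            user_sessions[-1].append(ts)       # continue current session
        else:
            user_sessions.append([ts])         # start a new session
    return sessions

t0 = datetime(2024, 1, 1, 9, 0)
events = [
    ("u1", t0),
    ("u1", t0 + timedelta(minutes=10)),  # within the gap: same session
    ("u1", t0 + timedelta(hours=2)),     # long gap: new session
    ("u2", t0 + timedelta(minutes=5)),
]
result = sessionize(events)
```

The same grouping is usually done in SQL with a windowed `LAG` over timestamps; the Python version just makes the gap rule explicit.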

Hello,
Hope you are doing well.
We have an urgent, high-priority global opening with a leading consulting organization in Qatar; immediate interview slots are available. We are looking for a Senior Data & AI Consultant with strong Azure expertise for a highly client-facing consulting role.
🔹 Quick Role Overview
- Role: Senior Data & AI Consultant
- Location: Doha, Qatar (Onsite)
- Employment Type: Full-time
- Travel: GCC travel required
- Experience Required: 8+ years
- Domain: Data Strategy, Data Engineering, AI/ML, Consulting
- Industry Exposure: Public Sector, BFSI, Oil & Gas, Government, Healthcare, Utilities
📌 Role Summary
The Senior Data & AI Consultant will play a critical role in driving enterprise Data & AI transformations across clients. This role blends data strategy, architecture, engineering, governance, and AI/ML delivery with strong consulting, stakeholder management, and pre-sales responsibilities.
You will work closely with CXOs, business leaders, and technical teams to design scalable Azure-based data platforms, develop advanced analytics and AI solutions, and guide organizations on responsible AI adoption.
🎯 Key Responsibilities
1. Data Strategy & Advisory
- Develop enterprise-wide Data & AI strategies and transformation roadmaps
- Perform data maturity assessments and define improvement plans
- Lead CXO-level workshops aligning business goals with Data & AI initiatives
- Guide clients on responsible and ethical AI practices
2. Data Architecture & Engineering
- Design scalable data platforms on Microsoft Azure
- Architect and implement batch and real-time pipelines using ADF, Synapse, Databricks, Event Hubs
- Build enterprise data models and modern lakehouse architectures
- Implement data quality, metadata, and lineage solutions using Azure Purview
- Partner with AI teams to ensure data readiness for ML/AI use cases
3. AI & Advanced Analytics
- Lead end-to-end AI/ML solution development using Azure ML, OpenAI, Cognitive Services
- Build ML and Generative AI solutions (LLMs, RAG, prompt engineering)
- Implement MLOps practices (CI/CD, deployment, monitoring, governance)
- Ensure Responsible AI – explainability, fairness, and compliance
- Collaborate across the full AI lifecycle: data sourcing → modeling → deployment
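The RAG pattern named above can be sketched with a toy retriever. The documents, scoring, and prompt wording here are illustrative assumptions; a production pipeline would use embeddings and an Azure OpenAI call rather than bag-of-words cosine similarity:

```python
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def retrieve(query, docs, k=1):
    """Return the k docs most similar to the query (stand-in for a vector store)."""
    q = Counter(query.lower().split())
    ranked = sorted(docs, key=lambda d: cosine(q, Counter(d.lower().split())),
                    reverse=True)
    return ranked[:k]

def build_prompt(query, docs):
    """Ground the model's answer in retrieved context: the 'RAG' step."""
    context = "\n".join(retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer using only the context."

docs = [
    "Azure Synapse provides integrated analytics over data warehouses.",
    "Event Hubs ingests millions of events per second.",
]
prompt = build_prompt("how do we ingest streaming events", docs)
```

The retrieve-then-prompt shape is the whole pattern; everything else in a real system (chunking, embeddings, reranking) refines the `retrieve` step.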
4. Consulting, Pre-Sales & Leadership
- Lead client workshops, PoCs, demos, and discovery sessions
- Support proposal development and solution architecture design
- Contribute to thought leadership (blogs, whitepapers, events)
- Mentor junior consultants and ensure delivery excellence
🧠 Desired Skills & Competencies
✅ Must-Have
- 8+ years of experience across Data, Analytics, and/or AI roles
- Strong hands-on experience in the Azure ecosystem
- Proven consulting and client-facing experience
- Exposure to at least one specialization:
  - Data Management
  - Data Governance
  - AI/ML Engineering
🔧 Technical Expertise (Any one or more areas)
A. Data Management
- Azure Data Factory, Synapse, Data Lake, Databricks, Event Hubs
- Strong SQL, Python, Spark
- Data modeling, ETL/ELT, data quality frameworks
- Exposure to Microsoft Fabric / Airflow / Kafka
B. Data Governance & Strategy
- Knowledge of DAMA-DMBOK, CMMI-DMM
- Experience with Informatica (Axon, IDQ, EDC, IDMC)
- Governance operating models, stewardship, privacy frameworks
C. AI/ML Engineering
- Python, TensorFlow, PyTorch, Scikit-learn, Azure ML
- Generative AI, LLMs, RAG pipelines
- Experience in at least 3 domains: NLP, CV, predictive modeling, recommender systems, anomaly detection
- Strong MLOps exposure
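Of the domains listed above, anomaly detection is the easiest to sketch in a few lines; here is a minimal z-score detector in pure Python (the threshold of 2 standard deviations is an assumed choice for the example, and real pipelines would use robust statistics or a trained model):

```python
import statistics

def zscore_anomalies(values, threshold=2.0):
    """Flag points more than `threshold` standard deviations from the mean."""
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []  # all values identical: nothing to flag
    return [x for x in values if abs(x - mean) / stdev > threshold]

readings = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 55.0]  # one obvious outlier
anomalies = zscore_anomalies(readings)
```

Note that a large outlier inflates both the mean and the standard deviation, which is why the threshold here is lower than the textbook 3; robust variants (median/MAD) avoid that masking effect.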
🤝 Consulting & Leadership Skills
- Excellent stakeholder and CXO communication
- Strong problem-solving and business translation skills
- Workshop facilitation, proposal writing, client presentations
- Mentoring and technical leadership capabilities
🎓 Education & Certifications
Required Qualification:
- BCA / B.Sc. (Computer Science) / B.E. / B.Tech
- MCA / M.E. / M.Tech
AI Agent Builder – Internal Functions and Data Platform Development Tools
About the Role:
We are seeking a forward-thinking AI Agent Builder to lead the design, development, deployment, and usage reporting of Microsoft Copilot and other AI-powered agents across our data platform development tools and internal business functions. This role will be instrumental in driving automation, improving onboarding, and enhancing operational efficiency through intelligent, context-aware assistants.
This role is central to our GenAI transformation strategy. You will help shape the future of how our teams interact with data, reduce administrative burden, and unlock new efficiencies across the organization. Your work will directly contribute to our “Art of the Possible” initiative—demonstrating tangible business value through AI.
You Will:
• Copilot Agent Development: Use Microsoft Copilot Studio and Agent Builder to create, test, and deploy AI agents that automate workflows, answer queries, and support internal teams.
• Data Engineering Enablement: Build agents that assist with data connector scaffolding, pipeline generation, and onboarding support for engineers.
• Knowledge Base Integration: Curate and integrate documentation (e.g., ERDs, connector specs) into Copilot-accessible repositories (SharePoint, Confluence) to support contextual AI responses.
• Prompt Engineering: Design reusable prompt templates and conversational flows to streamline repeated tasks and improve agent usability.
• Tool Evaluation & Integration: Assess and integrate complementary AI tools (e.g., GitLab Duo, Databricks AI, Notebook LM) to extend Copilot capabilities.
• Cross-Functional Collaboration: Partner with product, delivery, PMO, and security teams to identify high-value use cases and scale successful agent implementations.
• Governance & Monitoring: Ensure agents align with Responsible AI principles, monitor performance, and iterate based on feedback and evolving business needs.
• Adoption and Usage Reporting: Use Microsoft Viva Insights and other tools to report on user adoption, usage and business value delivered.
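Reusable prompt templates like those described above can start as parameterized strings. This sketch uses Python's `string.Template`; the connector-onboarding scenario, field names, and template wording are invented for the example:

```python
from string import Template

# A reusable prompt template for a hypothetical connector-onboarding agent.
CONNECTOR_PROMPT = Template(
    "You are a data-platform assistant.\n"
    "Using the spec at $spec_path, scaffold a $source_type connector\n"
    "named $name and list any required credentials."
)

def render_prompt(name, source_type, spec_path):
    """Fill the template; Template.substitute raises KeyError if a field is missing."""
    return CONNECTOR_PROMPT.substitute(
        name=name, source_type=source_type, spec_path=spec_path
    )

prompt = render_prompt("orders-ingest", "Salesforce", "docs/connectors/salesforce.md")
```

Keeping templates as data rather than inline strings is what makes them reviewable and reusable across agents, which is the point of the prompt-engineering responsibility above.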
What We're Looking For:
• Proven experience with Microsoft 365 Copilot, Copilot Studio, or similar AI platforms (e.g., ChatGPT, Claude).
• Strong understanding of data engineering workflows, tools (e.g., Git, Databricks, Unity Catalog), and documentation practices.
• Familiarity with SharePoint, Confluence, and Microsoft Graph connectors.
• Experience in prompt engineering and conversational UX design.
• Ability to translate business needs into scalable AI solutions.
• Excellent communication and collaboration skills across technical and non-technical audiences.
Bonus Points:
• Experience with GitLab Duo, Notebook LM, or other AI developer tools.
• Background in enterprise data platforms, ETL pipelines, or internal business systems.
• Exposure to AI governance, security, and compliance frameworks.
• Prior work in a regulated industry (e.g., healthcare, finance) is a plus.
We’re looking for a Java Backend Developer with strong experience in Spring Boot, AWS, and Microservices to join our growing team. If you're passionate about scalable backend systems and love working in a fast-paced environment, we want to hear from you!
What You’ll Bring:
- 4+ years of backend development experience
- Strong hands-on expertise in Core Java and Spring Boot
- Experience designing and developing Microservices architecture
- Solid working knowledge of AWS services
- Familiarity with RESTful APIs, version control (Git), and CI/CD
- Strong problem-solving and debugging skills
- Ability to work independently and in a collaborative team environment
- Immediate joiners are highly preferred
We are looking for an experienced Java Developer with strong proficiency in Kafka and MongoDB to join our dynamic team. The ideal candidate will have a solid background in designing and developing high-performance, scalable, and reliable applications in a microservices architecture. You will be responsible for building real-time data processing systems, integrating various services, and ensuring smooth data flow across systems.
Key Responsibilities:
- Design, develop, and maintain scalable Java applications with a focus on performance and reliability.
- Build and maintain Kafka-based real-time data pipelines for handling high-volume, low-latency data.
- Work with MongoDB to design and optimize database schemas and queries for high throughput and availability.
- Collaborate with cross-functional teams to define, design, and implement new features and improvements.
- Troubleshoot and resolve issues related to system performance, scalability, and reliability.
- Ensure software quality through best practices, including testing, code reviews, and continuous integration.
- Implement and maintain security best practices in both code and data handling.
- Participate in agile development cycles, including sprint planning, daily standups, and retrospectives.
Required Skills & Qualifications:
- 7+ years of experience in Java development, with a strong understanding of core Java concepts (J2EE, multithreading, etc.).
- Hands-on experience with Apache Kafka, including setting up brokers, producers, consumers, and understanding Kafka Streams.
- Proficient in working with MongoDB for designing efficient data models, indexing, and optimizing queries.
- Experience with microservices architecture and RESTful APIs.
- Familiarity with containerization technologies like Docker and orchestration tools like Kubernetes is a plus.
- Strong understanding of distributed systems, message-driven architectures, and event streaming.
- Familiarity with version control systems like Git.
- Excellent problem-solving skills, with the ability to debug and optimize code for high-performance systems.
- Experience with CI/CD pipelines and automated testing.
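The real-time pipeline work described above often reduces to windowed aggregation. Here is a toy tumbling-window count in pure Python; Kafka Streams would compute this continuously over a live topic, and the 60-second window size is an assumption for the example:

```python
from collections import defaultdict

WINDOW_SECONDS = 60  # assumed tumbling-window size

def windowed_counts(events):
    """Count events per (key, window) pair, like a Kafka Streams tumbling window.

    Each event is (key, epoch_seconds); the window start is the timestamp
    rounded down to the nearest WINDOW_SECONDS boundary.
    """
    counts = defaultdict(int)
    for key, ts in events:
        window_start = ts - (ts % WINDOW_SECONDS)
        counts[(key, window_start)] += 1
    return dict(counts)

events = [("clicks", 5), ("clicks", 59), ("clicks", 61), ("views", 10)]
counts = windowed_counts(events)
```

The windows are non-overlapping by construction; a hopping or sliding window changes only how `window_start` is derived from the timestamp.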
A) Skills Required
Essential Skills (top two skills): three possible combinations.
1. Expertise in both ElasticSearch and Kafka (preferred).
OR
2. Expertise in ElasticSearch, with willingness to learn Kafka.
OR
3. Expertise in Kafka, with willingness to learn ElasticSearch.
B) Other Information
Educational Qualifications: Graduate
Experience: Mid-Level (6+ years)
Minimum Qualifications:
ElasticSearch/OpenSearch
· Software lifecycle/programming skills
· Linux
· Python
· Ingestion tools (Logstash, OpenSearch Ingestion, Fluentd, Fluent Bit) and deployment tooling (Harness, CloudFormation, container images, ECS, Lambda)
· SQL queries
· JSON
· AWS knowledge
Kafka/MSK
· Linux
· In-depth understanding of Kafka broker configurations, ZooKeeper, and connectors
· Understanding of Kafka topic design and creation
· Good knowledge of replication and high availability for Kafka systems
· Good understanding of producers and consumer groups
· Understanding of Kafka partitions and scaling up
· Kafka latency/lag and throughput
· Integrating Kafka Connect with various internal or external data sources
· Kafka security using SSL/certificates
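Partitioning, called out above, is what gives Kafka per-key ordering. A sketch of the default key-hashing idea in Python: Kafka's default partitioner actually applies murmur2 to the key bytes, so the CRC32 used here is a simplified stand-in to keep the example stdlib-only, and the partition count is assumed:

```python
import zlib

NUM_PARTITIONS = 6  # assumed partition count for the topic

def partition_for(key, num_partitions=NUM_PARTITIONS):
    """Map a key to a partition deterministically, like Kafka's default partitioner.

    Same key -> same partition -> all events for that key are consumed in order.
    (Kafka uses murmur2 on the key bytes; crc32 stands in for the sketch.)
    """
    return zlib.crc32(key.encode("utf-8")) % num_partitions

p1 = partition_for("order-42")
p2 = partition_for("order-42")
```

This is also why changing a topic's partition count after the fact breaks key locality: the modulus changes, so existing keys remap to different partitions.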
• 6 - 8 years of software development / consulting / support experience is required.
• Experience with one of these databases - Neo4j, Oracle, MS SQL Server, PostgreSQL – is required.
• Strong understanding of relational and / or graph database concepts is required.
• Experience working with public cloud technologies specifically AWS, Azure & GCP.
• Experience working with RedHat / CentOS / Amazon Linux operating systems will be a plus.
• Competence in at least one of the following languages (in no particular order): Java, C++, C#, Python, Scala
• Experience with ETL tools ingesting / cleansing / transforming data from multiple data sources will be a plus.
• Knowledge of or hands-on experience with Kettle will be a plus.
• Experience with Customer Support for databases primarily on Linux OS.
• Familiarity with enterprise software architectures and application development methodologies is required.
• Familiarity with Linux diagnostics and tuning; Windows tuning knowledge will be a plus.
• Performance Tuning and Troubleshooting skills are required.
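Graph-database concepts like those required above can be illustrated with a plain adjacency-list traversal. The graph, names, and relationship are invented for the example; Neo4j would express the same query as a Cypher `MATCH` with `shortestPath()`:

```python
from collections import deque

# A tiny directed "KNOWS" graph as an adjacency list (stand-in for graph-DB data).
KNOWS = {
    "alice": ["bob"],
    "bob": ["carol", "dave"],
    "carol": ["dave"],
    "dave": [],
}

def shortest_path(graph, start, goal):
    """BFS shortest path by hop count: the idea behind Cypher's shortestPath()."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # goal unreachable

path = shortest_path(KNOWS, "alice", "dave")
```

The practical difference between graph and relational stores is that this traversal follows pointers directly instead of joining tables per hop, which is why multi-hop queries are where graph databases shine.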
We are looking for Full-stack engineers with a minimum of 2 to 3+ years of exposure to Node JS, Express JS, Mongo DB, and React JS to join our team.
At Aryston Web Solution Pvt. Ltd, you will use the latest software development languages, techniques, and approaches and work with a visionary team to build solutions you can be proud of.
Your role and responsibilities
- Creating solid back-end code with Node JS.
- Writing code in async-await architecture.
- Integrating REST APIs into front-end applications using React JS.
- Writing Mongoose schemas efficiently; able to implement MRC architecture in the Express framework.
- Implementing business logic in front-end applications using JavaScript; should be able to handle complex JSON structures.
- Should be able to manage state using Redux, Redux Thunk/Saga, or MobX.
- Should be comfortable using both functional and class components in React.
- Must be familiar with popular React UI libraries like Material UI and Ant Design.
- Implement best practices and constantly learn new ways of keeping codebases up to date.
- Knowledge of TypeScript, GraphQL, Next.js, or Socket.io is like a match made in heaven.
- Hands-on experience with the AWS SDK, Firebase services, and OAuth 2.0 handling is nice to have.
Criteria
- 2+ years of work experience with Node JS, MongoDB, JavaScript, and React JS.
- Proficient understanding of code versioning tools, such as Git.
- You care about the design and user experience of an application. The choice of third-party libraries should be elegant.
- You must respect the timeline of the deliverables and releases.
Why Join Us?
Aryston Web Solution builds robust solutions for clients in 10+ sectors, including cybersecurity, education, healthcare, hospitality, insurance, and transportation, making a positive impact in 5+ countries.
We also have our own B2B SaaS product, serving a wide range of clients through it.
At Aryston -
- Your work gets recognition from day 1.
- Since we work very closely, we honor the decisions and opinions of our colleagues.
- We grow, you grow.
- We explore and learn new things together.
- Fixed working hours.
- We follow proper productive techniques to get things done, so there are fewer phone calls or boring meetings, and of course zero blame game.
Kolkata (West Bengal)
Role Description:
- Lister is actively engaged with clients across digital transformation and engineering.
- As part of this engagement, you will work with our clients on solution design and architecture and on technology decisions, while coordinating the team's technical efforts.
- As a Technical Lead, you will collaborate with Solution Architects to define the technology roadmap, evaluate and recommend technologies, architect and design solutions, and work with other developers to oversee development and implementation.
- You will translate functional requirements from business stakeholders to the technical team for implementation.
- You will troubleshoot design flaws and system bottlenecks.
- You will be hands-on with code reviews and evaluate implementations for performance and scale.
- Manage communication between our clients and internal teams in India and the USA.
- Coordinate with the offshore team to deliver projects across work streams; identify risks, issues, and blockers, and mitigate them for the team.
- Rapidly gain an understanding of project requirements and drive execution.
- Work with the delivery team to ensure the highest-quality materials are produced and all client needs are met.
Skills Required:
- Excellent understanding of backend and frontend components in the following stack: NodeJS, TypeScript, JavaScript, ReactJS, and multiple RDBMS and NoSQL databases such as Postgres, MySQL, and MongoDB
- Experience working with AWS services (Lambda, Step Functions, SQS, S3, SNS)
- Microservice design patterns are a must
- Experience with Docker, Kubernetes, containerization, and CI/CD pipelines is preferable
- Experience with software development tools like Prettier, CodeCov, Snyk, etc.
- Excellent verbal and written communication skills; client-facing and interpersonal skills; must be a listener, a presenter, and a people-person
- Experience designing scalable architectures for Node-based applications
- Ability to grasp complex implementations; proven results delivering client solutions
- Should have experience working with an offshore team
- Experience working on large-scale technology transformation projects for product companies is preferable
- Technology contributions to open-source projects will be an added advantage
- Should have completed certifications or training in AWS (AWS Solutions Architect), NodeJS, or front-end frameworks
Experience: 9 to 16 Yrs
Notice Period: Immediate to 15 days
Job Description:
- The Lead Engineer is responsible for defining the technical architecture and detailed design, and for the implementation, performance, and scalability of the developed system.
- The individual should be capable of single-handedly understanding the requirements, liaising with the product owner and external teams, and producing the required technical output.
- Strong leadership skills are required, including the confidence to challenge current trends and produce new designs.
- Should be an avid technology enthusiast with strong views on technical architecture.
Technical Skills:
1) Strong Python and MongoDB skills are mandatory
2) Web application architecture: experience developing and maintaining at least one large project
3) Knowledge of one or more frameworks such as WSGI (uWSGI), Django, or Flask for web application development
4) Exposure to databases such as MongoDB, BigQuery, Elasticsearch, Redis, etc.
5) Knowledge of code quality processes and discipline (Git, Jenkins, etc.)
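Since the frameworks listed above (uWSGI, Django, Flask) all sit on the WSGI interface, here is that interface itself in a minimal stdlib sketch; the `/health` route and response body are invented for the example:

```python
def app(environ, start_response):
    """A minimal WSGI application: the callable that Flask/Django/uWSGI build on."""
    if environ.get("PATH_INFO") == "/health":
        body = b'{"status": "ok"}'
        status = "200 OK"
        content_type = "application/json"
    else:
        body = b"not found"
        status = "404 Not Found"
        content_type = "text/plain"
    start_response(status, [("Content-Type", content_type),
                            ("Content-Length", str(len(body)))])
    return [body]  # WSGI apps return an iterable of bytes

# Exercise the app without a server by calling it directly with a fake environ.
captured = {}
def fake_start_response(status, headers):
    captured["status"] = status

response_body = b"".join(app({"PATH_INFO": "/health"}, fake_start_response))
```

Being able to call the application object directly like this is also why WSGI apps are easy to unit-test without standing up a server.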









