
As a member of the Design & Production team at Mosaic, you will play a key role in all
motion- and video-related work spanning brand social media content, the website, and the app.
You will be the centre of execution for all motion design and animation work, ranging
from creating mockups and moodboards to producing animations and videos for different
marketing collaterals.
You will be a good fit if you -
● Can think about design from first principles and bring about solutions that reflect
thoughtful consideration rather than software expertise alone
● Have a strong portfolio of high quality motion and animation work
● Can structure your projects into reusable components and modular assets, and take
initiatives that significantly reduce time to execution
● Have good theoretical understanding of color theory, typography, illustration and print
design
● Are extremely adept at After Effects, Adobe Premiere, and other video tools
● Can storyboard any communication piece and bring it to life with motion
Your core responsibilities will be -
● Working within the design team to bring about the best suited solutions to brand
problems around video content
● Owning the entire execution pipeline - especially from a production standpoint
● Creating high quality animated videos and motion frames for all brand related marketing
collaterals
● Ensuring timely execution and adherence to processes
Minimum qualifications -
● A good portfolio of previous work samples
● 2 - 4 years of relevant experience
● Good working knowledge of all relevant tools, including Photoshop / Illustrator

About Mosaic Wellness
Company Description
VMax e-Solutions India Private Limited, based in Hyderabad, is a dynamic organization specializing in Open Source ERP Product Development and Mobility Solutions. As an ISO 9001:2015 and ISO 27001:2013 certified company, VMax is dedicated to delivering tailor-made and scalable products, with a strong focus on e-Governance projects across multiple states in India. The company's innovative technologies aim to solve real-life problems and enhance the daily services accessed by millions of citizens. With a culture of continuous learning and growth, VMax provides its team members opportunities to develop expertise, take ownership, and grow their careers through challenging and impactful work.
About the Role
We’re hiring a Senior Data Scientist with deep real-time voice AI experience and strong backend engineering skills.
You’ll own and scale our end-to-end voice agent pipeline that powers AI SDRs, customer support agents, and internal automation agents on calls. This is a hands-on, highly technical role where you’ll design and optimize low-latency, high-reliability voice systems.
You’ll work closely with our founders, product, and platform teams, with significant ownership over architecture and benchmarks.
What You’ll Do
1. Own the voice stack end-to-end – from telephony / WebRTC entrypoints to STT, turn-taking, LLM reasoning, and TTS back to the caller.
2. Design for real-time – architect and optimize streaming pipelines for sub-second latency, barge-in, interruptions, and graceful recovery on bad networks.
3. Integrate and tune models – evaluate, select, and integrate STT/TTS/LLM/VAD providers (and self-hosted models) for different use-cases, balancing quality, speed, and cost.
4. Build orchestration & tooling – implement agent orchestration logic, evaluation frameworks, call simulators, and dashboards for latency, quality, and reliability.
5. Harden for production – ensure high availability, observability, and robust fault-tolerance for thousands of concurrent calls in customer VPCs.
6. Shape the voice roadmap – influence how voice fits into our broader Agentic OS vision (simulation, analytics, multi-agent collaboration, etc.).
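The stack described above is a chain of streaming stages. As a rough illustration only, here is a toy asyncio pipeline where every stage function is a hypothetical stand-in (no real STT/TTS/LLM provider is called); a production system would stream partial results and handle barge-in rather than process whole chunks:

```python
import asyncio

async def stage(fn, inbox, outbox):
    """Consume items from inbox, transform them, forward downstream.
    None is the end-of-stream marker and is propagated."""
    while True:
        item = await inbox.get()
        if item is None:
            await outbox.put(None)
            return
        await outbox.put(fn(item))

async def pipeline(chunks):
    """Run audio chunks through STT -> LLM -> TTS stages connected by queues."""
    q_audio, q_text, q_reply, q_out = (asyncio.Queue() for _ in range(4))
    tasks = [
        asyncio.create_task(stage(lambda c: f"text({c})", q_audio, q_text)),   # fake STT
        asyncio.create_task(stage(lambda t: f"reply({t})", q_text, q_reply)),  # fake LLM
        asyncio.create_task(stage(lambda r: f"audio({r})", q_reply, q_out)),   # fake TTS
    ]
    for c in chunks:
        await q_audio.put(c)
    await q_audio.put(None)  # signal end of stream
    out = []
    while (item := await q_out.get()) is not None:
        out.append(item)
    await asyncio.gather(*tasks)
    return out

print(asyncio.run(pipeline(["chunk1", "chunk2"])))
# prints ['audio(reply(text(chunk1)))', 'audio(reply(text(chunk2)))']
```

Because each stage runs as its own task reading from a queue, later chunks enter STT while earlier ones are still in TTS, which is the basic shape of a low-latency streaming design.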
You’re a Great Fit If You Have
1. 6+ years of software engineering experience (backend or full-stack) in production systems.
2. Strong experience building real-time voice agents or similar systems using:
   - STT / ASR (e.g. Whisper, Deepgram, Assembly, AWS Transcribe, GCP Speech)
   - TTS (e.g. ElevenLabs, PlayHT, AWS Polly, Azure Neural TTS)
   - VAD / turn-taking and streaming audio pipelines
   - LLMs (e.g. OpenAI, Anthropic, Gemini, local models)
3. Proven track record designing and operating low-latency, high-throughput streaming systems (WebRTC, gRPC, websockets, Kafka, etc.).
4. Hands-on experience integrating ML models into live, user-facing applications with real-time inference & monitoring.
5. Solid backend skills with Python and TypeScript/Node.js; strong fundamentals in distributed systems, concurrency, and performance optimization.
6. Experience with cloud infrastructure – especially AWS (EKS, ECS, Lambda, SQS/Kafka, API Gateway, load balancers).
7. Comfortable working in Kubernetes / Docker environments, including logging, metrics, and alerting.
8. Startup DNA – at least 2 years in an early or mid-stage startup where you shipped fast, owned outcomes, and worked close to the customer.
Nice to Have
1. Experience self-hosting AI models (ASR / TTS / LLMs) and optimizing them for latency, cost, and reliability.
2. Telephony integration experience (e.g. Twilio, Vonage, Aircall, SignalWire, or similar).
3. Experience with evaluation frameworks for conversational agents (call quality scoring, hallucination checks, compliance rules, etc.).
4. Background in speech processing, signal processing, or dialog systems.
5. Experience deploying into enterprise VPC / on-prem environments and working with security/compliance constraints.
About Us:
Tradelab Technologies Pvt Ltd is not for those seeking comfort—we are for those hungry to make a mark in the trading and fintech industry. If you are looking for just another backend role, this isn’t it. We want risk-takers, relentless learners, and those who find joy in pushing their limits every day. If you thrive in high-stakes environments and have a deep passion for performance-driven backend systems, we want you.
• We’re looking for a Backend Developer (Python) with a strong foundation in backend technologies and a deep interest in scalable, low-latency systems.
• You should have 3–4 years of experience in Python-based development and be eager to solve complex performance and scalability challenges in trading and fintech applications.
• You measure success by your own growth, not external validation.
• You thrive on challenges, not on perks or financial rewards.
• Taking calculated risks excites you—you’re here to build, break, and learn.
• You don’t clock in for a paycheck; you clock in to outperform yourself in a high-frequency trading environment.
• You understand the stakes—milliseconds can make or break trades, and precision is everything.
Key Responsibilities:
• Develop and maintain scalable backend systems using Python.
• Design and implement REST APIs and socket-based communication.
• Optimize code for speed, performance, and reliability.
• Collaborate with frontend teams to integrate server-side logic.
• Work with RabbitMQ, Kafka, Redis, and Elasticsearch for robust backend design.
• Build fault-tolerant, multi-producer/consumer systems.
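The multi-producer/consumer pattern above can be sketched with Python's standard-library queue; this is an illustrative toy, not RabbitMQ or Kafka code (those brokers add persistence, acknowledgements, and network transport on top of the same shape):

```python
import queue
import threading

def producer(q, items):
    """Publish items onto the shared queue (stand-in for a broker producer)."""
    for item in items:
        q.put(item)

def consumer(q, results, lock):
    """Pull items until a None sentinel arrives; doubling is stand-in work."""
    while True:
        item = q.get()
        if item is None:
            return
        with lock:  # guard the shared results list
            results.append(item * 2)

def run(n_producers=2, n_consumers=3):
    q = queue.Queue()
    results, lock = [], threading.Lock()
    producers = [threading.Thread(target=producer, args=(q, range(i * 10, i * 10 + 5)))
                 for i in range(n_producers)]
    consumers = [threading.Thread(target=consumer, args=(q, results, lock))
                 for _ in range(n_consumers)]
    for t in producers + consumers:
        t.start()
    for t in producers:
        t.join()
    for _ in consumers:
        q.put(None)  # one sentinel per consumer so every worker shuts down
    for t in consumers:
        t.join()
    return sorted(results)

print(run())  # prints [0, 2, 4, 6, 8, 20, 22, 24, 26, 28]
```

The sentinel-per-consumer shutdown is the fault-tolerance detail worth noting: every worker is guaranteed to drain remaining work and exit cleanly.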
What We Expect:
• 3–4 years of experience in Python and backend development.
• Strong understanding of REST APIs, sockets, and network protocols (TCP/UDP/HTTP).
• Experience with RabbitMQ/Kafka, SQL & NoSQL databases, Redis, and Elasticsearch.
• Bachelor’s degree in Computer Science or related field.
Job Description: NPCI BBPS / VISA / RuPay Partnership Program Specialist
Role: NPCI Partnership & Integration Specialist
Location: Noida
Experience: 5–10 years in payments / banking / fintech domain
Key Responsibilities
- Drive partnership initiatives with NPCI, BBPS, VISA, NACH, and RuPay under regulatory and business frameworks.
- Manage end-to-end BBPS onboarding process (Biller/Agent/Operating Unit).
- Handle compliance, certification, and audit requirements as per NPCI and RBI guidelines.
- Coordinate with technical teams for API integration (BBPS, VISA, RuPay) and ensure smooth deployment.
- Build business cases for new bill categories, payment products, and recurring payments.
- Work with internal stakeholders to design settlement, reconciliation, and reporting processes.
- Maintain strong relationships with NPCI, card networks, and other ecosystem partners.
- Stay updated on NPCI circulars, regulatory changes, and new partnership programs.
Key Skills & Competencies
- Strong knowledge of NPCI platforms – BBPS, UPI, NACH, RuPay.
- Hands-on experience with BBPS integrations (via BBPOU/BOU APIs, Setu, BillAvenue, or direct).
- Familiarity with card schemes (VISA / RuPay) and recurring payments.
- Deep understanding of RBI guidelines, compliance frameworks, and certification processes.
- Excellent stakeholder management (with NPCI, banks, fintechs).
- Strong project management skills for partnership programs.
- Technical knowledge of payment APIs, settlement flows, and reconciliation tools is a plus.
Preferred Background
- Worked in fintechs / PSPs / bill payment aggregators (e.g., Setu, BillAvenue, PayU, Paytm, PhonePe, Razorpay).
- Prior experience with NPCI sandbox certification.
- Exposure to bank partnerships, merchant acquiring, and biller onboarding.
We are looking for an experienced US IT Recruiter to join our panel and handle sourcing of freelance interviewers and coordinating technical interviews for our clients in the US market. This role is ideal for someone with strong networking, recruitment, and stakeholder management skills in the US IT hiring domain.
Responsibilities:
● Source and onboard freelance technical interviewers across various IT domains (Java, Python, Data Engineering, Cloud, etc.) for US time zones.
● Coordinate end-to-end interview scheduling between clients and panel interviewers.
● Maintain & update the interviewer database to ensure quick availability for projects.
● Understand client requirements and match the right technical panel for the role.
● Follow up with interviewers & clients for timely completion of interviews.
● Ensure compliance with client expectations and InCruiter’s quality standards.
Must Have:
● 2–5 years of experience in US IT recruitment.
● Strong understanding of US time zones, work culture, and hiring processes.
● Experience in freelance/contract recruitment and panel sourcing preferred.
● Excellent networking, negotiation, and relationship-building skills.
● Proficient in using ATS, LinkedIn, and job boards for talent acquisition.
● Strong communication skills in English (both verbal & written).
● Ability to work independently and meet deadlines.
Job Description
Job Title: Data Engineer
Location: Hyderabad, India
Job Type: Full Time
Experience: 5 – 8 Years
Working Model: On-Site (No remote or work-from-home options available)
Work Schedule: Mountain Time Zone (3:00 PM to 11:00 PM IST)
Role Overview
The Data Engineer will be responsible for designing and implementing scalable backend systems, leveraging Python and PySpark to build high-performance solutions. The role requires a proactive and detail-oriented individual who can solve complex data engineering challenges while collaborating with cross-functional teams to deliver quality results.
Key Responsibilities
- Develop and maintain backend systems using Python and PySpark.
- Optimise and enhance system performance for large-scale data processing.
- Collaborate with cross-functional teams to define requirements and deliver solutions.
- Debug, troubleshoot, and resolve system issues and bottlenecks.
- Follow coding best practices to ensure code quality and maintainability.
- Utilise tools like Palantir Foundry for data management workflows (good to have).
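PySpark is the stated tool here, but the map/reduce shape of the large-scale processing it performs can be illustrated in plain Python with generators (a sketch under that analogy, not the PySpark or Foundry API):

```python
from itertools import islice

def chunks(iterable, size):
    """Yield fixed-size batches without materialising the whole input."""
    it = iter(iterable)
    while batch := list(islice(it, size)):
        yield batch

def partial_sums(records, size=1000):
    """Map step: one partial aggregate per batch (like a per-partition reduce)."""
    return [sum(batch) for batch in chunks(records, size)]

def total(records, size=1000):
    """Reduce step: combine partial aggregates into the final result."""
    return sum(partial_sums(records, size))

print(total(range(10), size=3))  # prints 45
```

Computing per-partition partials and combining them afterwards is what lets a framework like PySpark parallelise the aggregation across executors.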
Qualifications
- Strong proficiency in Python backend development.
- Hands-on experience with PySpark for data engineering.
- Excellent problem-solving skills and attention to detail.
- Good communication skills for effective team collaboration.
- Experience with Palantir Foundry or similar platforms is a plus.
Preferred Skills
- Experience with large-scale data processing and pipeline development.
- Familiarity with agile methodologies and development tools.
- Ability to optimise and streamline backend processes effectively.
Job location: Indore, M.P. (work from office)
Experience: 6 months of experience in Flutter mobile app development
Number of developers needed: 3
Key Responsibilities:
1) Working expertise in Flutter mobile app development, covering both design and development.
2) Expertise in building dynamic mobile apps with Flutter.
3) Quick learner; immediate joiners preferred.
Interested candidates can walk in directly for an interview between 10:30 A.M. and 7:00 P.M. (Monday to Saturday).
Company name: Logical SoftTech
Email: hrlogicalsofttech(@)gmail.com, hrlogicalcoders(@)gmail.com
Address: 201, B.N House, EC-52, behind Muli wala family restaurant, near Bombay Hospital, Scheme 94, Indore, Madhya Pradesh 452010
Contact: +91-78.69.73.15.95 (HR department), +91-74.15.95.09.19 (HR department), +91-821.02.51.824 (technical department)
BDM CUM Business Analyst
Note: Only candidates with a notice period of immediate to 15 days maximum can apply.
Kindly read carefully
Role: BDM CUM Business Analyst
Work Location: LUCKNOW
Experience: Minimum 2+ years in the same key skills
Budget: open
Mode: Permanent
Job description
We are looking for a candidate with 2-5 years of experience in business analysis and IT project coordination, with documentation experience in IT (web development / application development / mobile app development), along with international client-handling experience.
Job Responsibilities:
Discussion with customer/project team to understand the requirements
Creating, analyzing, and validating detailed functional and business specifications (BRD / SRS Preparation)
Requirement walk-through and getting requirement sign-off from the customer
Analyzing the requirements of the clients and converting them into technical documents as per the specifications
Creating wireframes to determine the flow and functionalities of the application
Facilitating the negotiation of requirements amongst multiple stakeholders
Creating Proposals and other relevant documents
Creating Slide Decks required for explaining Business Solutions & Approach
Explaining the requirements to technical teams and taking the estimates for the same
Understanding ad-hoc changes, analyzing their impact, and creating spec documents
Supporting testing in creating test cases and review of test cases
Allocating tasks to resources as per the milestones of the project
Coordination with the delivery team to deliver the project on or before the estimated deadline
Creating Release documents and UAT Testing of the release
Communicating with clients throughout the life cycle of a project to update them with the progress and path ahead from time to time
Skills:
- Exceptional written and verbal communication, and client-serving skills
- Must have knowledge of some analytical tools
- Ability to prioritize tasks and deliver to deadlines
- Strong relationship building, analytical, & problem-solving skills
- Must have good decision-making skills
- Must be a good team player, good learner, and self-starter
- Proactive and quick executor
- Must have demonstrable client/stakeholder management skills
- Must be self-driven with the ability to manage workload without direct supervision
Job responsibilities
- You will partner with teammates to create complex data processing pipelines in order to solve our clients' most complex challenges
- You will collaborate with Data Scientists in order to design scalable implementations of their models
- You will pair to write clean and iterative code based on TDD
- Leverage various continuous delivery practices to deploy, support and operate data pipelines
- Advise and educate clients on how to use different distributed storage and computing technologies from the plethora of options available
- Develop and operate modern data architecture approaches to meet key business objectives and provide end-to-end data solutions
- Create data models and speak to the tradeoffs of different modeling approaches
- Seamlessly incorporate data quality into your day-to-day work as well as into the delivery process
- Assure effective collaboration between Thoughtworks' and the client's teams, encouraging open communication and advocating for shared outcomes
- You have a good understanding of data modelling and experience with data engineering tools and platforms such as Kafka, Spark, and Hadoop
- You have built large-scale data pipelines and data-centric applications using any of the distributed storage platforms such as HDFS, S3, NoSQL databases (Hbase, Cassandra, etc.) and any of the distributed processing platforms like Hadoop, Spark, Hive, Oozie, and Airflow in a production setting
- Hands-on experience with MapR, Cloudera, Hortonworks and/or cloud-based Hadoop distributions (AWS EMR, Azure HDInsight, Qubole, etc.)
- You are comfortable taking data-driven approaches and applying data security strategy to solve business problems
- Working with data excites you: you can build and operate data pipelines, and maintain data storage, all within distributed systems
- You're genuinely excited about data infrastructure and operations with a familiarity working in cloud environments
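The TDD-driven pipeline work described above can be sketched with a small transform and its test side by side; the transform and all names are illustrative, not a client codebase:

```python
def dedupe_latest(events):
    """Keep only the latest value per key from (key, timestamp, value) events."""
    latest = {}
    for key, ts, value in events:
        if key not in latest or ts >= latest[key][0]:
            latest[key] = (ts, value)
    return {k: v for k, (ts, v) in latest.items()}

def test_dedupe_latest():
    """In a TDD flow this test is written before the transform itself."""
    events = [("a", 1, "x"), ("b", 1, "y"), ("a", 2, "z")]
    assert dedupe_latest(events) == {"a": "z", "b": "y"}

test_dedupe_latest()
print("ok")  # prints ok
```

Keeping each pipeline step as a pure function over plain records is what makes this style of test cheap to write, whether the step later runs in Spark, Airflow, or a streaming job.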
Professional skills
- You're resilient and flexible in ambiguous situations and enjoy solving problems from technical and business perspectives
- An interest in coaching, sharing your experience and knowledge with teammates
- You enjoy influencing others and always advocate for technical excellence while being open to change when needed
- Presence in the external tech community: you willingly share your expertise with others via speaking engagements, contributions to open source, blogs and more
- Research and develop statistical learning models for data analysis
- Collaborate with product management and engineering departments to understand company needs and devise possible solutions
- Keep up-to-date with latest technology trends
- Communicate results and ideas to key decision makers
- Implement new statistical or other mathematical methodologies as needed for specific models or analysis
- Optimize joint development efforts through appropriate database use and project design
Qualifications/Requirements:
- Masters or PhD in Computer Science, Electrical Engineering, Statistics, Applied Math or equivalent fields with strong mathematical background
- Excellent understanding of machine learning techniques and algorithms, including clustering, anomaly detection, optimization, neural networks, etc.
- 3+ years of experience building data-science-driven solutions, including data collection, feature selection, model training, and post-deployment validation
- Strong hands-on coding skills (preferably in Python) processing large-scale data set and developing machine learning models
- Familiarity with one or more machine learning or statistical modeling tools such as NumPy, scikit-learn, MLlib, TensorFlow
- Good team player with excellent written, verbal, and presentation communication skills
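One of the techniques listed above, anomaly detection, in its simplest z-score form using only the standard library (a minimal sketch; production work would typically reach for NumPy or scikit-learn):

```python
import statistics

def zscore_anomalies(values, threshold=3.0):
    """Flag values whose distance from the mean exceeds threshold * stdev."""
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)  # sample standard deviation
    return [v for v in values if abs(v - mean) / stdev > threshold]

data = [10, 11, 9, 10, 12, 10, 11, 100]  # 100 is the planted outlier
print(zscore_anomalies(data, threshold=2.0))  # prints [100]
```

A single extreme point inflates both the mean and the standard deviation, which is why robust variants (median / MAD) are often preferred on real data.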
Desired Experience:
- Experience with AWS, S3, Flink, Spark, Kafka, Elastic Search
- Knowledge and experience with NLP technology
- Previous work in a start-up environment
- Deep knowledge of ( List here the mobile platforms on which the app runs, e.g., Android, iOS, etc. )
- Developing new features and user interfaces from wireframe models
- Ensuring the best performance and user experience of the application
- Fixing bugs and performance problems
- Writing clean, readable, and testable code
- Cooperating with back-end developers, designers, and the rest of the team to deliver well-architected and high-quality solutions










