
As Conviva expands, we are building products that provide our customers with deep insight into end-user experience.
Platform and TLB Team
The vision for the TLB team is to build data-processing software that works on terabytes of streaming data in real time. You will engineer the next-gen Spark-like system for in-memory computation over large time-series datasets, covering both the Spark-like backend infrastructure and a library-based programming model. You will build a horizontally and vertically scalable system that analyses trillions of events per day within sub-second latencies, utilize the latest big data technologies to build solutions for use cases across multiple verticals, and lead technology innovation that will have a big business impact for years to come. You will be part of a worldwide team building software with the latest technologies and the best software development tools and processes.
What You’ll Do
This is an individual contributor position with the following expectations:
- Design, build, and maintain the stream-processing and time-series analysis system at the heart of Conviva’s products
- Take responsibility for the architecture of the Conviva platform
- Build features, enhancements, and new services, and fix bugs, in Scala and Java on a Jenkins-based pipeline, deploying as Docker containers on Kubernetes
- Own the entire lifecycle of your microservice including early specs, design, technology choice, development, unit-testing, integration-testing, documentation, deployment, troubleshooting, enhancements, etc.
- Lead a team to develop a feature or parts of a product
- Adhere to the Agile model of software development to plan, estimate, and ship per business priority
What You Need to Succeed
- 5+ years of work experience in software development of data processing products.
- Engineering degree in software or equivalent from a premier institute.
- Excellent knowledge of Computer Science fundamentals such as algorithms and data structures. Hands-on experience with functional programming and a solid grasp of its concepts
- Excellent programming and debugging skills on the JVM. Proficient in writing code in Scala/Java/Rust/Haskell/Erlang that is reliable, maintainable, secure, and performant
- Experience with big data technologies like Spark, Flink, Kafka, Druid, HDFS, etc.
- Deep understanding of distributed systems concepts and scalability challenges including multi-threading, concurrency, sharding, partitioning, etc.
- Experience with the Akka/Lagom framework and/or stream-processing technologies like RxJava or Project Reactor is a big plus, as is knowledge of design patterns like event streaming, CQRS, and DDD for building large microservice architectures
- Excellent communication skills. Willingness to work under pressure. Hunger to learn and succeed. Comfortable with ambiguity. Comfortable with complexity
Underpinning the Conviva platform is a rich history of innovation. More than 60 patents represent award-winning technologies and standards, including first-of-its-kind innovations like time-state analytics and AI-automated data modeling, that surface actionable insights. By understanding real-world human experiences and acting within seconds of observation, our customers can solve business-critical issues and focus on growing their business ahead of the competition. Examples of the brands Conviva has helped fuel streaming growth for include DAZN, Disney+, HBO, Hulu, NBCUniversal, Paramount+, Peacock, Sky, Sling TV, Univision, and Warner Bros. Discovery.
Privately held, Conviva is headquartered in Silicon Valley, California with offices and people around the globe. For more information, visit us at www.conviva.com. Join us to help extend our leadership position in big data streaming analytics to new audiences and markets!

Job Title: Lead Database Engineer
Location: Gurgaon Sector-43
Experience Required: 4+ Years
Employment Type: Full-Time
Summary:
We are seeking a highly skilled Lead Database Engineer with expertise in managing and optimizing database systems, primarily focusing on Amazon Aurora PostgreSQL, MySQL, and NoSQL databases. The ideal candidate will have in-depth knowledge of AWS services, database architecture, performance tuning, and security practices.
Key Responsibilities:
1. Database Administration:
- Manage and administer Amazon Aurora PostgreSQL, MySQL, and NoSQL database systems to ensure high availability, performance, and security.
- Implement robust backup and recovery procedures to maintain data integrity.
2. Optimization and Performance:
- Develop and execute optimization strategies at the database, query, collection, and table levels.
- Proactively monitor performance and fine-tune RDS parameter groups for optimal database operations.
- Conduct root cause analysis and resolve complex database performance issues.
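As an illustration of the query tuning this subsection describes, here is a minimal sketch of inspecting a query plan before and after adding an index. It uses SQLite's EXPLAIN QUERY PLAN as a lightweight, testable stand-in for PostgreSQL's EXPLAIN ANALYZE; the table and index names are invented for the example.

```python
import sqlite3

def query_plan(conn, sql, params=()):
    """Return the plan detail strings for a statement (SQLite analogue of EXPLAIN)."""
    return [row[3] for row in conn.execute(f"EXPLAIN QUERY PLAN {sql}", params)]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INTEGER, ts TEXT)")
conn.executemany("INSERT INTO events (user_id, ts) VALUES (?, ?)",
                 [(i % 10, f"2024-01-{i % 28 + 1:02d}") for i in range(1000)])

# Without an index the planner must scan the whole table...
before = query_plan(conn, "SELECT * FROM events WHERE user_id = ?", (3,))
# ...after indexing the filter column it can do an index search instead.
conn.execute("CREATE INDEX idx_events_user ON events(user_id)")
after = query_plan(conn, "SELECT * FROM events WHERE user_id = ?", (3,))
```

The same before/after discipline applies when tuning PostgreSQL: read the plan, add or adjust an index, and confirm the plan actually changed.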
3. AWS Services and Architecture:
- Leverage AWS services such as RDS, Aurora, and DMS to ensure seamless database operations.
- Perform database version upgrades for PostgreSQL and MySQL, integrating new features and performance enhancements.
4. Replication and Scalability:
- Implement and manage various replication strategies, including master-master and master-slave replication, ensuring data consistency and scalability.
5. Security and Access Control:
- Manage user permissions and roles, maintaining strict security protocols and access controls.
6. Collaboration:
- Work closely with development teams to optimize database design and queries, aligning database performance with application requirements.
Required Skills:
- Strong Expertise: Amazon Aurora PostgreSQL, MySQL, and NoSQL databases.
- AWS Services: Experience with RDS, Aurora, and DMS.
- Optimization: Hands-on experience in query optimization, database tuning, and performance monitoring.
- Replication Strategies: Knowledge of master-master and master-slave replication setups.
- Problem Solving: Proven ability to troubleshoot and resolve complex database issues, including root cause analysis.
- Security: Strong understanding of data security and access control practices.
- Collaboration: Ability to work with cross-functional teams and provide database-related guidance.
Preferred Qualifications:
- Certification in AWS or database management tools.
- Experience with other NoSQL databases like MongoDB or Cassandra.
- Familiarity with Agile and DevOps methodologies.

Performance Marketing Executive (Meta, Google & Shopify | Luxury D2C Brands)
📍 Near Khan Market, New Delhi (On-site)
🏢 The Brand Concierge
Role Overview
We are looking for a performance-driven marketing professional with hands-on experience in Meta Ads, Google Ads, and Shopify for luxury D2C brands across jewellery, home décor, lifestyle, and fashion. This role is ideal for someone with 2–4 years of experience in paid media and growth marketing, with a strong understanding of scaling premium brands through data-led strategies and conversion optimization.
Key Responsibilities
• Plan, execute, and optimize paid campaigns across Meta Ads (Facebook & Instagram) and Google Ads (Search, Display, YouTube)
• Manage end-to-end campaign setup including targeting, budgeting, creatives, and A/B testing
• Drive acquisition, conversions, and ROAS for Shopify-based D2C brands
• Optimize Shopify funnels (landing pages, product pages, checkout) for higher conversion rates
• Track and analyze performance (CPC, CPA, CTR, ROAS) with data-driven optimizations
• Set up and manage Meta Pixel, Google Analytics, and conversion tracking
• Build and scale retargeting and audience segmentation strategies
• Collaborate with creative teams to deliver high-performing, brand-aligned ad creatives
• Monitor trends, platform updates, and new ad formats across Meta and Google
Good to Have
• Experience working with luxury, fashion, lifestyle, or home décor brands
• Understanding of premium consumer behavior and purchase journeys
• Experience with Shopify plugins, landing page tools, or CRO tools
• Exposure to influencer-led performance campaigns and UGC-driven ads
• Agency experience handling multiple clients and budgets
Why Join The Brand Concierge?
• Work with premium and luxury D2C brands across fashion, lifestyle, beauty, and home
• Be part of a fast-growing, creative, and performance-driven agency
• Hands-on ownership of high-impact campaigns across Meta, Google, and Shopify
• Collaborative environment with design, content, and strategy teams
• Opportunity to grow into a senior performance marketing role
About The Brand Concierge
The Brand Concierge is a marketing agency specializing in customized, strategy-led solutions for premium and luxury brands. We work across fashion, lifestyle, beauty, home, and legacy segments to drive measurable, performance-led growth.
Our core services include performance marketing, social media marketing, influencer marketing, and branding—helping brands build a distinct identity and scale sustainably in the digital landscape.
Job Title: Chief Technology Officer (CTO)
Location: Trivandrum, Kerala
Employment Type: Full-Time
Experience: 7+ years
Role Overview
We are seeking a highly skilled and forward-thinking Chief Technology Officer (CTO) with approximately 7 years of progressive experience in technology leadership. The candidate will be responsible for driving the company’s technological vision, leading product development, and ensuring scalable, secure, and innovative solutions aligned with business objectives.
Key Responsibilities
• Define and execute the company’s technology strategy and roadmap
• Lead software development, engineering, and IT operations teams
• Oversee architecture design, system scalability, and performance optimization
• Drive digital transformation and innovation initiatives
• Ensure adherence to security standards, data protection, and compliance requirements
• Manage cloud infrastructure, DevOps practices, and system integrations
• Collaborate with product and business teams for technology-driven growth
• Evaluate and implement emerging technologies to enhance business capabilities
• Establish best practices in coding, testing, deployment, and maintenance
• Manage vendor relationships and external technology partners
Required Qualifications
• B.Tech / BE / M.Tech / MCA or equivalent in Computer Science, IT, or related field
• Minimum 7 years of relevant experience in software development and technology leadership roles
• Proven experience in building and scaling technology platforms
• Strong knowledge of modern tech stacks, cloud platforms (AWS, Azure, GCP), and system architecture
• Experience with Agile / Scrum methodologies
Key Skills & Competencies
• Technology Strategy & Leadership
• System Architecture & Design
• Cloud Computing & DevOps
• Cybersecurity & Data Privacy
• Product Development & Lifecycle Management
• Problem-Solving & Analytical Thinking
• Stakeholder & Team Management
• Strong Communication & Decision-Making Skills
Regards,
Radhika Sharma
HR Manager, Estabizz Fintech Private Limited
- Strong Senior UI/UX Product Designer profile – IC Role in B2C Mobile Apps
- Mandatory (Experience 1): 4+ YOE as a Product Designer, UI/UX Designer, or Mobile App Designer with strong IC ownership in tech/startup environments
- Mandatory (Experience 2): Strong experience as a Fullstack Product Designer (UI + UX) for Mobile Apps (especially Android)
- Mandatory (Core Skill 1): Must be highly proficient in Figma - able to create wireframes, user flows, high-fidelity mockups, and micro-interactions from scratch
- Mandatory (Portfolio): Strong UI/UX portfolio of B2C Mobile Apps; Portfolio should reflect strong UI, Interaction, Visual Design skills (UI Heavy portfolio) (preferably launcher/utility/feed-driven apps)
- Mandatory (Company): B2C product companies
- Mandatory (Exclusion): Payment gateway companies
Preferred Companies
B2C Product Companies List: Ultrahuman, boAt, Noise, Fire-Boltt, Mamaearth, Nykaa, Purplle, SUGAR Cosmetics, mCaffeine, CRED, Paytm, PhonePe, Amazon India, Flipkart, Meesho, Tata 1mg, Practo, PharmEasy, HealthifyMe, Cult.fit, Groww, Zerodha, Upstox, Dream11, MPL (Mobile Premier League), WinZO, Ather Energy, Wakefit, SleepyCat, Urban Company, ShareChat, Moj, Dailyhunt, Koo, Chingari, Josh (VerSe Innovation), BYJU’S, Unacademy, Vedantu, Cuemath, Embibe
Role: Full-Time, Long-Term
Required: Python, SQL
Preferred: Experience with financial or crypto data
OVERVIEW
We are seeking a data engineer to join as a core member of our technical team. This is a long-term position for someone who wants to build robust, production-grade data infrastructure and grow with a small, focused team. You will own the data layer that feeds our machine learning pipeline—from ingestion and validation through transformation, storage, and delivery.
The ideal candidate is meticulous about data quality, thinks deeply about failure modes, and builds systems that run reliably without constant attention. You understand that downstream ML models are only as good as the data they consume.
CORE TECHNICAL REQUIREMENTS
Python (Required): Professional-level proficiency. You write clean, maintainable code for data pipelines—not throwaway scripts. Comfortable with Pandas, NumPy, and their performance characteristics. You know when to use Python versus push computation to the database.
SQL (Required): Advanced SQL skills. Complex queries, query optimization, schema design, execution plans. PostgreSQL experience strongly preferred. You think about indexing, partitioning, and query performance as second nature.
Data Pipeline Design (Required): You build pipelines that handle real-world messiness gracefully. You understand idempotency, exactly-once semantics, backfill strategies, and incremental versus full recomputation tradeoffs. You design for failure—what happens when an upstream source is late, returns malformed data, or goes down entirely. Experience with workflow orchestration required: Airflow, Prefect, Dagster, or similar.
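A minimal sketch of the idempotency requirement above: keying each record by a natural identifier and upserting means a retried or backfilled batch leaves the table unchanged. SQLite stands in for the production database here, and the schema is invented for illustration.

```python
import sqlite3

def ingest(conn, records):
    """Idempotent load: re-running the same batch is a no-op apart from
    refreshing payloads, because each record is keyed by (source, event_id)."""
    conn.executemany(
        """INSERT INTO events (source, event_id, payload)
           VALUES (?, ?, ?)
           ON CONFLICT(source, event_id) DO UPDATE SET payload = excluded.payload""",
        records,
    )

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE events (
    source TEXT, event_id TEXT, payload TEXT,
    PRIMARY KEY (source, event_id))""")

batch = [("exchange_a", "e1", "{...}"), ("exchange_a", "e2", "{...}")]
ingest(conn, batch)
ingest(conn, batch)  # retry after a partial failure: no duplicate rows
count = conn.execute("SELECT COUNT(*) FROM events").fetchone()[0]
```

The same upsert-on-natural-key pattern makes backfills safe: replaying any historical window converges to the same table state.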
Data Quality (Required): You treat data quality as a first-class concern. You implement validation checks, anomaly detection, and monitoring. You know the difference between data that is missing versus data that should not exist. You build systems that catch problems before they propagate downstream.
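The missing-versus-should-not-exist distinction above can be made concrete with a small validation pass like the following (the field names and staleness threshold are illustrative, not from the posting):

```python
from datetime import datetime, timedelta, timezone

def validate(rows, max_staleness=timedelta(minutes=5), now=None):
    """Run basic quality checks; return a list of (index, problem) pairs.
    Distinguishes data that is missing (None) from data that should not
    exist (e.g. non-positive prices), and flags stale records."""
    now = now or datetime.now(timezone.utc)
    problems = []
    for i, row in enumerate(rows):
        if row.get("price") is None:
            problems.append((i, "missing price"))       # absent data
        elif row["price"] <= 0:
            problems.append((i, "non-positive price"))  # data that should not exist
        if now - row["ts"] > max_staleness:
            problems.append((i, "stale record"))        # freshness check
    return problems

now = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
rows = [
    {"price": 101.5, "ts": now - timedelta(minutes=1)},   # clean
    {"price": None,  "ts": now - timedelta(minutes=1)},   # missing value
    {"price": -3.0,  "ts": now - timedelta(minutes=30)},  # invalid and stale
]
issues = validate(rows, now=now)
```

Running such checks at ingestion boundaries is what catches problems before they propagate downstream.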
WHAT YOU WILL BUILD
Data Ingestion: Pipelines pulling from diverse sources—crypto exchanges, traditional market feeds, on-chain data, alternative data. Handling rate limits, API quirks, authentication, and source-specific idiosyncrasies.
Data Validation: Checks ensuring completeness, consistency, and correctness. Schema validation, range checks, freshness monitoring, cross-source reconciliation.
Transformation Layer: Converting raw data into clean, analysis-ready formats. Time series alignment, handling different frequencies and timezones, managing gaps.
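The time-series alignment described above might look roughly like this: snapping an irregular tick stream onto a regular grid with last-observation-carried-forward, while refusing to fill across gaps wider than a threshold (all names and parameters here are hypothetical):

```python
from datetime import datetime, timedelta

def align(series, grid, max_gap):
    """Align an irregular (timestamp, value) series onto a regular grid by
    carrying the last observation forward, but never across a gap wider than
    max_gap -- those grid points stay None rather than being fabricated."""
    out, i, last = [], 0, None
    series = sorted(series)
    for t in grid:
        while i < len(series) and series[i][0] <= t:
            last = series[i]          # most recent observation at or before t
            i += 1
        if last is not None and t - last[0] <= max_gap:
            out.append((t, last[1]))
        else:
            out.append((t, None))     # gap too wide: leave it visibly missing
    return out

start = datetime(2024, 1, 1)
grid = [start + timedelta(minutes=m) for m in range(5)]
ticks = [(start + timedelta(seconds=30), 100.0),
         (start + timedelta(seconds=90), 101.0)]
aligned = align(ticks, grid, max_gap=timedelta(minutes=2))
```

Leaving over-wide gaps as None instead of filling them keeps downstream models honest about where data actually existed.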
Storage and Access: Schema design optimized for both write patterns (ingestion) and read patterns (ML training, feature computation). Data lifecycle and retention management.
Monitoring and Alerting: Observability into pipeline health. Knowing when something breaks before it affects downstream systems.
DOMAIN EXPERIENCE
Preference for candidates with experience in financial or crypto data—understanding market data conventions, exchange-specific quirks, and point-in-time correctness. You know why look-ahead bias is dangerous and how to prevent it.
Time series data at scale—hundreds of symbols with years of history, multiple frequencies, derived features. You understand temporal joins, windowed computations, and time-aligned data challenges.
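A point-in-time temporal join that avoids the look-ahead bias mentioned above can be sketched as a strict as-of lookup: each event sees only feature values published strictly before its own timestamp (the data here is invented for the example):

```python
import bisect
from datetime import datetime

def asof_join(events, features):
    """Point-in-time join: attach to each event the latest feature value known
    strictly BEFORE the event timestamp. Using <= (or joining on calendar day)
    would leak future information -- the look-ahead bias the posting warns about."""
    features = sorted(features)
    times = [t for t, _ in features]
    out = []
    for t, label in events:
        idx = bisect.bisect_left(times, t)  # features[idx-1] is the last one < t
        out.append((t, label, features[idx - 1][1] if idx > 0 else None))
    return out

def day(d):
    return datetime(2024, 1, d)

features = [(day(1), 0.5), (day(3), 0.8)]           # published on day 1 and day 3
events = [(day(2), "trade_a"), (day(3), "trade_b")]
joined = asof_join(events, features)
```

Note that the day-3 event does not see the day-3 feature: at the instant of the event that value is not yet safely "known", which is exactly the point-in-time correctness the role calls for.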
High-dimensional feature stores—we work with hundreds of thousands of derived features. Experience managing, versioning, and serving large feature sets is valuable.
ENGINEERING STANDARDS
Reliability: Pipelines run unattended. Failures are graceful with clear errors, not silent corruption. Recovery is straightforward.
Reproducibility: Same inputs and code version produce identical outputs. You version schemas, track lineage, and can reconstruct historical states.
Documentation: Schemas, data dictionaries, pipeline dependencies, operational runbooks. Others can understand and maintain your systems.
Testing: You write tests for pipelines—validation logic, transformation correctness, edge cases. Untested pipelines are broken pipelines waiting to happen.
TECHNICAL ENVIRONMENT
PostgreSQL, Python, workflow orchestration (flexible on tool), cloud infrastructure (GCP preferred but flexible), Git.
WHAT WE ARE LOOKING FOR
Attention to Detail: You notice when something is slightly off and investigate rather than ignore.
Defensive Thinking: You assume sources will send bad data, APIs will fail, schemas will change. You build accordingly.
Self-Direction: You identify problems, propose solutions, and execute without waiting to be told.
Long-Term Orientation: You build systems you will maintain for years.
Communication: You document clearly, explain data issues to non-engineers, and surface problems early.
EDUCATION
University degree in a quantitative/technical field preferred: Computer Science, Mathematics, Statistics, Engineering. Equivalent demonstrated expertise also considered.
TO APPLY
Include: (1) CV/resume, (2) Brief description of a data pipeline you built and maintained, (3) Links to relevant work if available, (4) Availability and timezone.
Hi All,
We have a great opportunity with Cyient for the skill set of SAP S/4HANA & BTP Developer.
Please find the details below for your reference.
Job title: SAP S/4HANA & BTP Developer
Work Location: Any location (BAN, COB, HYD & PUN)
Experience range: 3 to 6 years
Mandatory / Preferred Skills:
Technical Skills
SAP S/4HANA ABAP & BTP Developer
· Good understanding of SAP CAP, including CDS modelling concepts and best practices.
· Relevant experience in SAP BTP and experience in implementing BTP Extension Suite for extending SAP applications.
· Solid experience in backend development using Node.js for creating CAP microservices.
· Knowledge of unit testing frameworks such as Chai, Mocha, or equivalent, and experience in writing unit test cases for backend services.
General:
· Strong debugging and troubleshooting skills to identify and resolve issues in CAP applications, BTP extensions, and backend services.
· Strong collaboration and communication skills to work effectively with cross-functional teams and stakeholders.
· Ability to work independently and deliver high-quality results in a fast-paced environment.
· Strong documentation skills to create technical documentation for future reference and knowledge sharing.
· Proactive attitude towards learning and staying updated with the latest trends and technologies in SAP CAP, BTP, Node.js, and unit testing frameworks.
· Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field (4+ year degree)
We have an urgent requirement for a Collibra Expert at Persistent Systems Ltd
Job Description:
Expert in Collibra for implementing governance workflows, data cataloguing, data lineage, integrations, data assets, etc.
Well versed in message events
Understands data modelling and has good knowledge of other areas of data governance
Experience setting up Collibra on cloud (preferred) and on-premises
Must understand how to define data policies and standards
Collibra certified
Strong knowledge of databases
Knowledge of the Insurance/Claims domain is good to have
• 3+ years of experience building mobile applications.
• Proficient with React Native.
• Experience in developing and delivering large-scale Android and iOS mobile applications via React Native.
• Ability to work through new and complex React Native issues and contribute to libraries as needed.
• Obsessed with optimization and ready to go the extra mile to deliver the best app experience to customers.
• Firm grasp of the JavaScript language and its nuances, including ES6+ syntax and TypeScript.
• Experience writing unit tests with libraries like Jest, Enzyme, Jasmine, Mocha, etc.
• Experience in at least one native language, Android or iOS (Swift/Objective-C).
• Good knowledge of monitoring and tracking down app crashes and bugs using advanced tools.
• Worked in a startup environment with high levels of ownership and commitment.
• A growth mindset, a passion for building things from the ground up, and, most importantly, being fun to work with.
Python API Developer
JD:
Experience: 4-6 Yrs
Notice Period: 10-20 days or within 1 month
>> Develop and maintain various security software products involving queues, caching, and database management.
>> Hands-on Python coding experience is required, along with knowledge of data structures, object-oriented programming, and algorithms.
>> Extensive experience developing asynchronous systems.
>> Integrate user-facing elements developed by front-end developers with server-side logic.
>> Implement security and data protection.
>> Performance tuning, improvement, balancing, usability, and automation.
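A minimal sketch of the asynchronous, queue-based style these points describe, using Python's asyncio (the worker count and the doubling "work" are placeholders for real processing):

```python
import asyncio

async def worker(name, queue, results):
    """Consume jobs until cancelled; task_done() lets queue.join() unblock."""
    while True:
        job = await queue.get()
        results.append((name, job * 2))  # stand-in for real processing work
        queue.task_done()

async def main():
    queue, results = asyncio.Queue(), []
    workers = [asyncio.create_task(worker(f"w{i}", queue, results))
               for i in range(3)]
    for job in range(10):
        queue.put_nowait(job)
    await queue.join()          # wait until every queued job is processed
    for w in workers:
        w.cancel()              # workers loop forever; stop them explicitly
    return results

results = asyncio.run(main())
```

The queue decouples producers from consumers, which is the same shape you would use with an external broker in a larger system.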
Mandatory Skills:
- Python
- Flask/Django
- API
TM1 Lead Engineer
Job Description:
The Lead Engineer should have good communication skills and be able to work with colleagues in Change Delivery and with Finance users.
The Lead Engineer should have a very good understanding of financial statements.
As the Lead Engineer you may work across different projects simultaneously.
- 9+ years of TM1 development experience
- Expertise in developing end-to-end solutions from the ground up in TM1
- Ability to design secure, scalable, configurable, and modular solutions
- Expertise in applying optimisations for both performance and configuration
- Highly proficient in all aspects of TM1 development and infrastructure
- Expert in IBM Planning Analytics 2.0
- Expert in development on PAW and PAX
- Highly proficient in developing TM1 REST API-based solutions
- Strong knowledge of MDX, SQL, Excel, Visual Basic, and relational databases
- Good verbal communication skills and the ability to work across geographies
- A self-motivated, confident team player who leads by example and provides guidance to others
- DevOps and Agile engineering practitioner; test-driven development