Sizzle is an exciting new startup that’s changing the world of gaming. At Sizzle, we’re building AI to automate gaming highlights directly from Twitch and YouTube streams. We’re looking for a superstar engineer who is well versed in AI and audio technologies: audio event detection, speech-to-text, interpretation, and sentiment analysis.
You will be responsible for:
Developing audio algorithms to detect key moments within popular online games, such as:
Streamer speaking, shouting, etc.
Gunfire, explosions, and other in-game audio events
Speech-to-text and sentiment analysis of the streamer’s narration
Leveraging baseline technologies such as TensorFlow, and building models on top of them
Building neural network architectures for audio analysis as it pertains to popular games
Specifying exact requirements for training data sets, and working with analysts to create the data sets
Training final models, including techniques such as transfer learning, data augmentation, etc. to optimize models for use in a production environment
Working with back-end engineers to get all of the detection algorithms into production, to automate the highlight creation
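To make the first responsibility concrete: before any neural model, loud in-game moments (gunfire, shouting) can be flagged with short-time energy over the raw waveform. The sketch below is a minimal illustration; the frame length and threshold are assumed values for the example, not Sizzle's actual detection pipeline:

```python
# Minimal sketch: flag "loud" frames in an audio signal via short-time
# RMS energy. Illustrative baseline only; the frame length and threshold
# are assumed values, not a tuned production detector.
import math

def loud_frames(samples, frame_len=1024, threshold=0.5):
    """Return indices of frames whose RMS energy exceeds threshold.

    samples: floats in [-1.0, 1.0], as decoded from an audio stream.
    """
    hits = []
    for i in range(0, len(samples) - frame_len + 1, frame_len):
        frame = samples[i:i + frame_len]
        rms = math.sqrt(sum(s * s for s in frame) / frame_len)
        if rms > threshold:
            hits.append(i // frame_len)
    return hits

# Synthetic signal: quiet tone, a loud burst, then quiet again.
quiet = [0.01 * math.sin(0.1 * n) for n in range(1024)]
burst = [0.9 * math.sin(0.3 * n) for n in range(1024)]
signal = quiet + burst + quiet
print(loud_frames(signal))  # prints [1]: only the middle frame is loud
```

In practice a trained model (e.g., a CNN over spectrogram features in TensorFlow, as the role describes) would replace the fixed threshold, but a simple energy baseline like this is a common sanity check when labeling data and evaluating detectors.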
You should have the following qualities:
Solid understanding of AI frameworks and algorithms, especially pertaining to audio analysis, speech-to-text, sentiment analysis, and natural language processing
Experience using Python, TensorFlow and other AI tools
Demonstrated understanding of algorithms for audio analysis and natural language processing, such as CNNs and LSTMs
Nice to have: some familiarity with AI-based audio analysis including sentiment analysis
Familiarity with AWS environments
Excited about working in a fast-changing startup environment
Willingness to learn rapidly on the job, try different things, and deliver results
Ideally a gamer or someone interested in watching gaming content online
Skills:
Machine Learning, Audio Analysis, Sentiment Analysis, Speech-To-Text, Natural Language Processing, Neural Networks, TensorFlow, OpenCV, AWS, Python
Work Experience: 2-10 years
About Sizzle
Sizzle is building AI to automate gaming highlights, directly from Twitch and YouTube videos. Presently, over 700 million fans around the world watch gaming videos on Twitch and YouTube. Sizzle is creating a new highlights experience for these fans, so they can catch up on their favorite streamers and esports leagues. Sizzle is available at www.sizzle.gg.


Job Description
We are seeking a skilled Microsoft Dynamics 365 Developer with 4–7 years of hands-on experience in designing, customizing, and developing solutions within the Dynamics 365 ecosystem. The ideal candidate should have strong technical expertise, solid understanding of CRM concepts, and experience integrating Dynamics 365 with external systems.
Key Responsibilities
- Design, develop, and customize solutions within Microsoft Dynamics 365 CE.
- Work on entity schema, relationships, form customizations, and business logic components.
- Develop custom plugins, workflow activities, and automation.
- Build and enhance integrations using APIs, Postman, and related tools.
- Implement and maintain security models across roles, privileges, and access levels.
- Troubleshoot issues, optimize performance, and support deployments.
- Collaborate with cross-functional teams and communicate effectively with stakeholders.
- Participate in version control practices using Git.
Must-Have Skills
Core Dynamics 365 Skills
- Dynamics Concepts (Schema, Relationships, Form Customization): Advanced
- Plugin Development: Advanced (writing and optimizing plugins, calling actions, updating related entities)
- Actions & Custom Workflows: Intermediate
- Security Model: Intermediate
- Integrations: Intermediate (API handling, Postman, error handling, authorization & authentication, DLL merging)
Coding & Versioning
- C# Coding Skills: Intermediate (Able to write logic using if-else, switch, loops, error handling)
- GIT: Basic
Communication
- Communication Skills: Intermediate (Ability to clearly explain technical concepts and work with business users)
Good-to-Have Skills (Any 3 or More)
Azure & Monitoring
- Azure Functions: Basic (development, debugging, deployment)
- Azure Application Insights: Intermediate (querying logs, pushing logs)
Reporting & Data
- Power BI: Basic (building basic reports)
- Data Migration: Basic (data import with lookups, awareness of migration tools)
Power Platform
- Canvas Apps: Basic (building basic apps using Power Automate connector)
- Power Automate: Intermediate (flows & automation)
- PCF (PowerApps Component Framework): Basic
Skills: Microsoft Dynamics, JavaScript, Plugins
Must-Haves
Microsoft Dynamics 365 (4-7 years), Plugin Development (Advanced), C# (Intermediate), Integrations (Intermediate), Git (Basic)
Notice period - Immediate to 15 days
Locations: Bangalore only
Review Criteria:
- Strong Dremio / Lakehouse Data Architect profile
- 5+ years of experience in Data Architecture / Data Engineering, with minimum 3+ years hands-on in Dremio
- Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
- Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, Iceberg along with distributed query planning concepts
- Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
- Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
- Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
- Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
- Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies
Preferred:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) or data catalogs (Collibra, Alation, Purview); familiarity with Snowflake, Databricks, or BigQuery environments
Role & Responsibilities:
You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
Ideal Candidate:
- Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
- 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
Preferred:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
- Exposure to Snowflake, Databricks, or BigQuery environments.
- Experience in high-tech, manufacturing, or enterprise data modernization programs.
WHO YOU ARE
To be successful in this role, you’ll need to have the following skills:
- Love for coding: A fanatic about writing beautiful, scalable code.
- Sense of analytics: Strong analytical and troubleshooting skills; resourceful, innovative, and inventive.
- Dynamic: Comfortable dealing with lots of moving pieces, with exquisite attention to detail and an appetite for learning new technologies and systems.
- Team player: A knack for influencing without being authoritative; pitches in wherever the team needs help, from writing blog posts to supporting customers.
- Accountability: A high sense of ownership for your code and relentlessness in delivering projects with high business impact.
KEY QUALIFICATIONS
- BE/BTech in Computer Science or related field.
- Minimum 3+ years of UI/Frontend development and a strong understanding of building complex layouts using JavaScript, CSS and HTML.
KEY SKILLS
- Strong computer system analysis and design skills in current methodologies and patterns, including vanilla JavaScript; some experience with ReactJS, Redux, React Native, Webpack, and TypeScript or similar libraries/tools.
- Obsessed with WPO (Web Performance Optimization) and web/mobile performance analysis.
- Experienced with web standards and protocols such as HTTP, DNS, TCP/IP, and socket APIs in general.
Role - Relationship & Wealth Manager
About Client:
Our client is a leading Indian investment services company with a wide array of products and services, such as distribution of mutual funds and insurance, equity and derivatives, commodities, PMS, and financial planning.
Roles and Responsibilities:
- Acquire new clients, generate revenue, and maintain relationships with High-Net-Worth Individuals (HNIs), Ultra-High-Net-Worth Individuals (UHNIs), and corporate treasuries.
- Enhance and build on existing bank tie-ups.
- Offer comprehensive wealth management services.
- Enhance the relationship value of existing customers.
- Ensure increased wallet share to prevent devaluation of securities.
- Analyze client portfolios, assess risk appetite, and provide tailored advice on wealth products.
Location: Mumbai, Pune, Hyderabad, Bangalore, Delhi, Chennai, Coimbatore, Kerala, Kolkata, Nashik.
Qualification:
- MBA (Finance), CA, or CS, with relevant work experience in managing investment products (2-15 years of experience).
- Proficiency in wealth products (e.g., mutual funds, PMS, AIFs, structured products, direct equities, offshore investments, risk-protection solutions).
- Build and maintain an extensive external network to stay connected with clients.
- Demonstrate a service-oriented attitude, delivering smooth and superior service.
- Ability to analyse client data, market trends, and stay updated on current market conditions.
- Proven track record in sales, particularly in the private banking or wealth management sector.
- Strong interpersonal and communication skills.
- Relevant financial certifications or qualifications (e.g., CFP, CFA) is a plus.
- Knowledge of regulatory compliance in the financial services industry.
- Excellent organizational and time-management skills
The Platform Data Science team works at the intersection of data science and engineering. Domain experts develop and advance platforms, including the data platform, the machine learning platform, and platforms for Forecasting, Experimentation, Anomaly Detection, Conversational AI, Underwriting of Risk, Portfolio Management, Fraud Detection & Prevention, and more. We are also the Data Science and Analytics partners for Product and provide Behavioural Science insights across Jupiter.
About the role:
We’re looking for strong Software Engineers who can combine EMR, Redshift, Hadoop, Spark, Kafka, Elasticsearch, TensorFlow, PyTorch, and other technologies to build the next-generation Data Platform, ML Platform, and Experimentation Platform. If this sounds interesting, we’d love to hear from you!
This role involves designing and developing software products that impact many areas of our business. The individual in this role will help define requirements, create software designs, implement code to those specifications, provide thorough unit and integration testing, and support products while they are deployed and used by our stakeholders.
Key Responsibilities:
Participate in, own, and influence the architecture and design of systems
Collaborate with other engineers, data scientists, and product managers
Build intelligent systems that drive decisions
Build systems that enable us to perform experiments and iterate quickly
Build platforms that enable scientists to train, deploy, and monitor models at scale
Build analytical systems that drive better decision making
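As a toy illustration of the experimentation-platform work above: variant assignment is commonly done by hashing the (experiment, user) pair so each user lands in a stable, roughly uniform bucket. This is a generic sketch, not Jupiter's actual implementation; the experiment and user IDs are made up:

```python
# Deterministic experiment bucketing: hash the (experiment, user) pair
# so a given user always maps to the same variant of an experiment.
# Illustrative only; real platforms layer on salts, holdouts, exposure
# logging, and config management.
import hashlib

def assign_variant(experiment_id, user_id, variants):
    """Stably map a user to one of `variants`, roughly uniformly."""
    key = f"{experiment_id}:{user_id}".encode()
    # sha256 (unlike Python's built-in hash) is stable across processes.
    bucket = int(hashlib.sha256(key).hexdigest(), 16) % len(variants)
    return variants[bucket]

# The same user always receives the same variant for a given experiment.
print(assign_variant("new-onboarding", "user-42", ["control", "treatment"]))
```

Because the mapping is a pure function of its inputs, any service can recompute an assignment without a shared store, which is one reason hashing-based bucketing is a popular building block for experimentation platforms.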
Required Skills:
Programming experience with at least one modern language, such as Java or Scala, including object-oriented design
Experience in contributing to the architecture and design (architecture, design patterns, reliability and scaling) of new and current systems
Bachelor’s degree in Computer Science or related field
Computer Science fundamentals in object-oriented design
Computer Science fundamentals in data structures
Computer Science fundamentals in algorithm design, problem solving, and complexity analysis
Experience with databases, analytics, big data systems, or business intelligence products:
Data lake, data warehouse, ETL, ML platform
Big data technologies such as Hadoop and Apache Spark
Job: Full Stack Developer
Location: Hyderabad
Mode: Hybrid
Experience: 4+ years
Skills: React.js, Node.js, JavaScript, AWS
We are seeking a talented Full Stack Developer to join our dynamic team. The ideal candidate has a strong background in Node.js, React.js, and AWS, and strong proficiency in JavaScript. As a Full Stack Developer, you will be responsible for designing, developing, and maintaining web applications throughout the entire software development lifecycle. Your expertise will contribute to innovative solutions that enhance user experience and drive business growth.
Responsibilities:
● Collaborate with cross-functional teams to define, design, and ship new features.
● Develop server-side logic using Node.js, ensuring high performance and responsiveness to requests from front-end components.
● Build reusable and efficient front-end components using React.js.
● Implement and maintain API integrations with third-party services.
● Optimize applications for maximum speed and scalability.
● Collaborate with other team members and stakeholders to troubleshoot, debug, and optimize application performance.
● Stay up-to-date with emerging technologies and industry trends to ensure best practices are consistently applied.
● Implement security and data protection measures.
● Participate in code reviews and provide constructive feedback to team members.
● Deploy applications on AWS and manage cloud infrastructure.
Description:
● 5+ years of overall experience, with 3 years of experience in React.js and Node.js.
● Bachelor's degree in Computer Science, Engineering, or a related field.
● Proven experience as a Full Stack Developer or similar role.
● Strong proficiency in JavaScript and its modern frameworks (Node.js, React.js).
● Experience with AWS services and cloud infrastructure.
● Familiarity with front-end technologies such as HTML, CSS, and JavaScript frameworks/libraries.
● Recommended experience: proficiency in Node.js, TypeScript, React, and MongoDB.
● Knowledge of database systems (SQL, NoSQL).
● Experience with user interface design.
● Knowledge of testing frameworks, including Mocha and Jest.
● Experience with browser-based debugging and performance testing.
● Excellent problem-solving and communication skills.
● Ability to work both independently and collaboratively in a team environment.
How to Apply:
If you have a passion for full-stack development, possess the required experience, and are excited about contributing to innovative projects, we invite you to apply. Please submit your resume, cover letter, and any relevant portfolio or project samples to Anusha Kalidindi.
P99soft is an equal opportunity employer, dedicated to fostering diversity and creating an inclusive workplace for all employees.

Job Summary:
We are seeking a motivated and enthusiastic Telecaller to join our admissions team. As a Telecaller, you will be responsible for contacting customers, introducing our products or services, and generating leads. Your primary objective will be to engage customers over the phone, provide information about our offerings, and address any inquiries or concerns. Excellent communication skills, persuasive abilities, and a customer-oriented approach are essential for success in this role.
Responsibilities:
• Contact potential customers via telephone to introduce our programs.
• Make outbound calls to clients and provide information about our offers.
• Engage in active listening to understand parents' needs.
• Answer customer inquiries, resolve complaints, and provide appropriate solutions.
• Maintain accurate and up-to-date records of customer interactions and leads in the CRM system.
• Follow up with customers to ensure satisfaction and foster long-term relationships.
• Collaborate with the sales team to develop effective strategies and techniques.
• Stay updated with product knowledge, market trends, and competitors' activities.
• Participate in sales meetings, training sessions, and team-building activities.
Requirements:
• Proven experience as a Telecaller or similar sales role.
• Excellent verbal communication and interpersonal skills.
• Persuasive and confident with the ability to build rapport with customers.
• Active listening skills to understand customer needs and concerns.
• Ability to work in a target-driven environment and achieve goals.
• Strong organizational skills with the ability to multitask and prioritize effectively.
• Proficiency in using CRM software and other telecommunication tools.
• Ability to handle objections and resolve customer complaints professionally.
• High school diploma or equivalent; additional education or certifications in sales or customer service are a plus.
We are hiring a Magento Architect.
Experience - 10-15 years
Location - Ahmedabad, Gujarat.
5-day work week
Should have a good command of English.
Responsibilities:
- Minimum 5 to 7 years of experience in Magento development
- Proficient in PHP/MySQL
- Experience with Magento Enterprise & Magento 2.0
- Familiar with technologies including XML, web services, and jQuery
- To develop high-end web applications
- To research and implement new technologies
- Working knowledge of shopping cart development with shipping, and Payment Gateway Integration for E-commerce websites
- Should have excellent database design and implementation skills
- Experience in API Integration with 3rd party ERP systems, 3rd party services
- Should have a technical bent of mind
- Should have problem-solving, task-prioritization, and multitasking abilities
- Should be able to understand and handle tasks independently
- Ability to work to deadlines & as a team member
- B.Tech in Computer Science/Information Technology or related field with 5-7 years of relevant experience
- Strong grounding in Data Structures and Algorithms
- Proficiency in the Java language: Collections & Maps, Exception Handling, Object-Oriented Concepts, Design Patterns, Multithreading, and File I/O
- Expertise in Mobile App Automation with Espresso, Appium, and/or XCUITest is a must
- Experience with Android Activity lifecycles, Android versions, builds and APKs, and app development
- Solid understanding of tools: JUnit, TestNG, Git commands, Jenkins job setup, Gradle, Maven, and CI/CD
- Candidates with knowledge of Swift, XCUITest scripting, test framework design, scheme creation, cloud device management, XCUI performance and profiling, XCUI coverage, SonarQube, and Xcode are preferred
- Decent experience in server automation, REST services and controllers, server test cases, JAR/WAR deployment, Spring Boot, backend architecture, MySQL/NoSQL, MongoDB, Redis, and ZooKeeper
Skills:
- Candidate must be strong in logic and programming
- Perform all phases of the software development life cycle, including application design, programming, testing (unit and system level), and internal documentation of code
- Strong knowledge of native Android APIs
- Experience with internet technologies such as JSON, XML, HTTP, REST, AWS, and MySQL
- Experience with analytics, crash reporting, and similar tooling
- Core Data and integration with downstream REST APIs & services
- Familiarity with JavaScript frameworks like jQuery and AngularJS is a plus
- Ability to work in a variety of client settings and in a team-oriented, collaborative environment
- Strong communication and client-facing skills, with the ability to work in a consulting environment







