


About Us
Zupee is India’s fastest-growing innovator in real money gaming, with a focus on predominantly skill-based games. Founded in 2018 by two IIT Kanpur alumni, we are backed by marquee global investors such as WestCap Group, Tomales Bay Capital, Matrix Partners, Falcon Edge, Orios Ventures, and Smile Group.
Know more about our recent funding: https://bit.ly/3AHmSL3
Our focus has been on innovating in the board, strategy, and casual games sub-genres. We innovate to ensure our games sit at the intersection of skill and entertainment, enabling our users to earn while they play.
Location: We are location-agnostic and our teams work from anywhere; physically, we are based in Gurugram.
Core Responsibilities:
- Design, develop, test, and deploy ML models that leverage multiple signal sources to build a personalized user experience
- Work closely with business stakeholders to identify potential data science applications
- Contribute to opportunity analysis, project proposals, and the design and implementation of ML projects in areas such as ranking, embeddings, and recommendation engines
- Collaborate with software engineering teams on experiment design, model implementation, and new feature creation
- Clearly communicate technical details, strategies, and outcomes to a business audience
What are we looking for
- 4+ years of hands-on experience in data science using techniques such as regression, classification, and NLP
- Previous experience in model deployment, model monitoring, optimization, and model interpretability
- Expertise with random forests, gradient boosting, k-NN, regression, and unsupervised learning algorithms
- Experience with neural networks (ANNs, RNNs), reinforcement learning, or other deep learning approaches
- Solid understanding of Python and common machine learning frameworks such as XGBoost, scikit-learn, and PyTorch/TensorFlow (see the sketch after this list)
- Outstanding problem-solving skills, with demonstrated ability to think creatively and strategically
- Technology-driven mindset; up to date with digital and technology literature and trends
- Must have knowledge of experimentation and basic statistics
- Must have experience in predictive analytics
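As a rough illustration of the Python and gradient-boosting stack named in the list above (synthetic data only; this is not Zupee's modeling code, and it assumes the xgboost package is installed):

```python
# Minimal predictive-modeling sketch with XGBoost and scikit-learn.
# The data is synthetic and purely illustrative.
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Hypothetical stand-in for user-level signals and an engagement label.
X, y = make_classification(n_samples=5000, n_features=30, random_state=7)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=7)

model = XGBClassifier(n_estimators=300, max_depth=4, learning_rate=0.05)
model.fit(X_train, y_train)

scores = model.predict_proba(X_test)[:, 1]
print("test AUC:", roc_auc_score(y_test, scores))
```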
Non-Desired Skills:
- Experience in Descriptive Analytics – Don’t apply
- Experience in NLP Text Analysis, Image, Speech Analysis – Don’t apply

About Zupee
Mission Zupee
We have made it our mission to improve people’s lives by improving their learning ability, skills, and mental aptitude through scientifically designed games. We have already helped over 10 million people improve their general knowledge, critical thinking, and competitive skills. We are living in an age of competition, and it is important to stay sharp and well informed. Zupee is committed to enhancing and enabling talent and skills for the workforce of the future by creating formats that educate, inform, entertain, and reward the skills-based behaviour of our users.
About us
Founded in 2018 by two IIT Kanpur graduates, we are backed by marquee investors and have raised USD 19Mn in Series A funding thus far. As India’s fastest-growing skill-based gaming company, we are at the forefront of the industry, with a new vision to become the world’s leading gamification platform.
Our Growth Story
The online gaming industry is growing at an annual rate of 40% and is expected to become a USD 14 billion market in India by 2025. Zupee’s growth has outpaced the industry: we became profitable at the EBT level in record time, with a user base of more than 10Mn and consumer spend of USD 2Bn on our platform.
Follow the links below to learn more about our recent Forbes Asia 30 Under 30 and fundraise coverage:
- https://www.vccircle.com/matrix-partners-india-leads-series-a-funding-round-in-gaming-app-zupee
- https://www.vccircle.com/exclusive-smile-group-backs-gaming-startup-zupee
- https://www.forbes.com/profile/zupee/?list=30under30-asia-consumer-technology&sh=1c4e8e5a1cb0
Our Culture
Zupee thrives on a culture of entrepreneurship (ownership and hustle), customer obsession, a bias for diagnosis and action, and trust and respect for everyone. We put our mission first in order to win our customers’ trust above everything.
Similar jobs
- An Engineering/Master’s Degree from a reputed institute
- 5+ years of experience in the B2B/SaaS/Enterprise domain. Great to have someone from the “Conversational AI” space
- Ability to engage and influence key stakeholders in accounts; Effective in performing technical/functional activities required during pre-sales stage
- Experience in Solution/Concept selling (CRM/ ERP/ Enterprise/ Contact Centers/ SaaS/ CPaaS Solutions) is a must
- Should be comfortable in understanding and explaining technology and solutions with a consultative approach
- Should have the ability to work independently and as part of a team
- Strong problem-solving and decision-making skills
- Candidates from Edtech/HR Tech companies will not be considered
- Candidates with more than 2 job changes in the last 3 years will not be considered

Job description
Title - Fullstack Developers
Job type - Full Time
Skills - Java, Spring Boot, SQL Server, PostgreSQL, MongoDB, AngularJS, TypeScript, Microservice Architecture, Kafka, Git, Git Flow Development, AWS, Azure, APIs, Web Services, CI/CD Pipelines, Agile/Scrum, Datadog
Location - Hyderabad, Telangana
Experience - 6 to 9 years
Annual CTC (INR): 23 LPA - 28 LPA
Deadline - 07/04/2025
Position: Fullstack Developer
Location: Hyderabad, India
Employment Type: Full-Time
Open Positions: 7
Role Overview:
We are seeking experienced Fullstack Developers with 6-9 years of expertise in designing, developing, and maintaining scalable, distributed applications. The ideal candidate will be proficient in Java, Spring Boot, AngularJS, and TypeScript, with strong experience in cloud-based development, CI/CD pipelines, and Agile methodologies.
Key Responsibilities:
- Troubleshoot and resolve complex data, system, and software issues in production.
- Develop emergency bug fixes and manage production applications.
- Ensure production issues are resolved within SLA timelines.
- Deploy application changes using CI/CD pipelines.
- Review and manage production changes using ServiceNow.
- Lead scrum teams in Agile environments, ensuring high-quality technology solutions.
- Develop and enhance application frameworks with a focus on performance and scalability.
- Implement unit tests, container build checks, and API tests to support shift-left practices.
- Evaluate new platforms, tools, and technologies to optimize development workflows.
- Provide technical guidance, code reviews, and mentorship to team members.
Required Technical Skills:
- Strong experience in Java and Spring Boot application development.
- Proficiency in RDBMS (SQL Server/PostgreSQL) and NoSQL (MongoDB/ElasticSearch).
- Hands-on experience with AngularJS, TypeScript, and event-driven architecture.
- Solid understanding of messaging queues like Kafka.
- Expertise in Git and Git flow for code lifecycle management.
- Cloud experience with AWS or Azure, including API and web service development.
- Hands-on experience with CI/CD deployment pipelines and DevOps tools.
- Familiarity with monitoring tools such as Datadog, Dynatrace.
Nice to Have:
- Experience with Azure DevOps, SonarQube, and monitoring tools like StatsD.
- Test automation expertise.
Soft Skills & Leadership:
- Excellent problem-solving and analytical abilities.
- Strong communication and stakeholder management skills.
- Ability to lead Agile teams and drive best development practices.
Additional Requirements:
- Must be available to join within 3 weeks.
- Willing to attend face-to-face interviews as per company requirements.
- Open to relocating to Hyderabad if not already based there.

Job Description: Data Engineer
Role Overview:
We are seeking a skilled Python Data Engineer with expertise in designing and implementing data solutions using the AWS cloud platform. The ideal candidate will be responsible for building and maintaining scalable, efficient, and secure data pipelines while leveraging Python and AWS services to enable robust data analytics and decision-making processes.
Key Responsibilities
· Design, develop, and optimize data pipelines using Python and AWS services such as Glue, Lambda, S3, EMR, Redshift, Athena, and Kinesis.
· Implement ETL/ELT processes to extract, transform, and load data from various sources into centralized repositories (e.g., data lakes or data warehouses); a minimal sketch follows this list.
· Collaborate with cross-functional teams to understand business requirements and translate them into scalable data solutions.
· Monitor, troubleshoot, and enhance data workflows for performance and cost optimization.
· Ensure data quality and consistency by implementing validation and governance practices.
· Work on data security best practices in compliance with organizational policies and regulations.
· Automate repetitive data engineering tasks using Python scripts and frameworks.
· Leverage CI/CD pipelines for deployment of data workflows on AWS.
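A minimal sketch of the kind of ETL step this list describes, assuming hypothetical bucket names, keys, and columns; in practice such a step would typically run inside Glue or Lambda rather than as a standalone script:

```python
# Minimal S3-to-S3 ETL sketch with boto3 and pandas. Bucket names, keys,
# columns, and the transformation are hypothetical placeholders.
import io

import boto3
import pandas as pd

s3 = boto3.client("s3")

# Extract: read a raw CSV object from a (hypothetical) landing bucket.
raw = s3.get_object(Bucket="example-raw-bucket", Key="events/2024/01/events.csv")
df = pd.read_csv(raw["Body"])

# Transform: simple cleanup and aggregation as a stand-in for real logic.
df = df.dropna(subset=["user_id"])
daily = df.groupby("event_date", as_index=False).agg(events=("user_id", "count"))

# Load: write the curated result back to a (hypothetical) curated bucket.
buf = io.StringIO()
daily.to_csv(buf, index=False)
s3.put_object(Bucket="example-curated-bucket", Key="curated/daily_events.csv", Body=buf.getvalue())
```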
Required Skills and Qualifications
· Professional Experience: 5+ years of experience in data engineering or a related field.
· Programming: Strong proficiency in Python, with experience in libraries like pandas, PySpark, or boto3.
· AWS Expertise: Hands-on experience with core AWS services for data engineering, such as:
  · AWS Glue for ETL/ELT.
  · S3 for storage.
  · Redshift or Athena for data warehousing and querying.
  · Lambda for serverless compute.
  · Kinesis or SNS/SQS for data streaming.
  · IAM roles for security.
· Databases: Proficiency in SQL and experience with relational (e.g., PostgreSQL, MySQL) and NoSQL (e.g., DynamoDB) databases.
· Data Processing: Knowledge of big data frameworks (e.g., Hadoop, Spark) is a plus.
· DevOps: Familiarity with CI/CD pipelines and tools like Jenkins, Git, and CodePipeline.
· Version Control: Proficient with Git-based workflows.
· Problem Solving: Excellent analytical and debugging skills.
Optional Skills
· Knowledge of data modeling and data warehouse design principles.
· Experience with data visualization tools (e.g., Tableau, Power BI).
· Familiarity with containerization (e.g., Docker) and orchestration (e.g., Kubernetes).
· Exposure to other programming languages like Scala or Java.
Education
· Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
Why Join Us?
· Opportunity to work on cutting-edge AWS technologies.
· Collaborative and innovative work environment.
InViz is a Bangalore-based startup helping enterprises simplify the search and discovery experience for both their end customers and their internal users. We use state-of-the-art technologies in Computer Vision, Natural Language Processing, Text Mining, and other ML techniques to extract information and concepts from data in different formats (text, images, videos) and make them easily discoverable through simple, human-friendly touchpoints.
TSDE - Data
Data Engineer:
- Should have 3-6 years of total experience in data engineering.
- Should have experience coding data pipelines on GCP.
- Prior experience with Hadoop systems is ideal, as the candidate may not have end-to-end GCP experience.
- Strong in programming languages such as Scala, Python, and Java.
- Good understanding of various data storage formats and their advantages.
- Should have exposure to GCP tools for developing end-to-end data pipelines for various scenarios (including ingesting data from traditional databases as well as integrating API-based data sources).
- Should have a business mindset to understand the data and how it will be used for BI and analytics purposes.
- Data Engineer certification preferred.
Experience working with GCP tools such as:
- Store: Cloud SQL, Cloud Storage, Cloud Bigtable, BigQuery, Cloud Spanner, Cloud Datastore
- Ingest: Stackdriver, Pub/Sub, App Engine, Kubernetes Engine, Kafka, Dataprep, microservices
- Schedule: Cloud Composer
- Processing: Cloud Dataproc, Cloud Dataflow, Cloud Dataprep
- CI/CD: Bitbucket + Jenkins / GitLab
- Atlassian Suite
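A minimal sketch of one ingest path from the list above: loading a CSV from Cloud Storage into BigQuery with the google-cloud-bigquery client. The project, dataset, table, and bucket names are hypothetical placeholders, and application-default credentials are assumed:

```python
# Minimal GCS-to-BigQuery load sketch using the google-cloud-bigquery client.
# Bucket, project, dataset, and table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,  # infer the schema from the file
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/raw/orders.csv",
    "example-project.analytics.orders",
    job_config=job_config,
)
load_job.result()  # block until the load job completes

table = client.get_table("example-project.analytics.orders")
print(f"Loaded {table.num_rows} rows into {table.full_table_id}")
```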

• Enhance existing UI/UX.
• Build reusable code and libraries for future use
• Ensure the technical feasibility of UI/UX designs
• Optimize application for maximum speed and scalability
• Assure that all user input is validated before submitting to back-end
• Collaborate with other team members.
• Experience with responsive and adaptive designs.

Zorro is a new-age pseudonymous social network.
Zorro has recently raised a $3.2Mn seed round and is backed by prominent investors, including Vijay Shekhar Sharma, Ritesh Agarwal, Kunal Shah, and 13 other unicorn founders, along with institutional investors including 3one4 Capital, 9 Unicorn Ventures, Eximius Ventures, and Roots Ventures.
We at Zorro are looking for an Android Developer.
Role & Responsibilities :
- Work as an individual contributor on your respective development tasks.
- Have a strong sense of ownership and work with the team to understand requirements, refine them based on technical feasibility and drive the team to deliver them.
- Collaborate with other teams, to ensure timely and good quality delivery of features in an agile development environment.
- Ensure the team follows processes and guidelines laid down for development.
- Draw on experience to lead technical designs and implementation of highly scalable and adaptable systems.
Requirements :
- 2-3 years of programming experience on the Android native platform.
- Hands-on experience with programming languages Kotlin and Java.
- Solid experience with MVVM architecture, along with knowledge of LiveData and ViewModels.
- Good experience with coroutines, multithreading, RxJava, and Retrofit.
- Experience with RESTful APIs to connect Android applications to back-end services.
- Strong coding skills, data structures, and algorithms.
- Experience with the latest libraries available for Android.
- Deep understanding of software engineering practices and Object-Oriented Analysis.
- Obsession with quality and customer experience.
- Good to have: knowledge of backend REST APIs and JSON.
What you can expect from us :
- Learning Budget: If there's a workshop, book, or event you think will help you learn, we'll cover your bill.
- Health insurance for you and your family.
What happens after you apply?
Step 1: Within 7 days of your application, our wholesome, original & expressive People Team will reach out to you for a quick chat.
Step 2: Within 4-6 days of chatting with the People Team, you will get a call from someone from your future team to discuss the job role.
Step 3: If all goes well, we'll schedule a call with your future manager to deep dive into the role with you and for you to show off your skills through a small task.
Step 4: After a quick interaction with the People Team, if our vibes match, you'll have a quick call with our Founders.
If we mutually enjoy the 4 steps, we onboard you with a big smile :)
NOTE :
At any step, if things don't work out, we will proactively send out an email. You are welcome to ask for detailed feedback and re-apply in the future.
We prefer [Passion>Skills>Education]
For more details visit joinzorro.com
- Gathering project requirements from customers and supporting their requests.
- Creating project estimates and scoping the solution based on clients’ requirements.
- Delivering on key project milestones in line with the project plan and budget.
- Establishing individual project plans and working with the team to prioritize production schedules.
- Communicating milestones to the team and to clients via scheduled work-in-progress meetings.
- Designing and documenting product requirements.
- Possessing good analytical skills; detail-oriented.
- Familiarity with Microsoft applications and working knowledge of MS Excel.
- Knowledge of MIS Reports & Dashboards.
- Maintaining strong customer relationships with a positive, can-do attitude.




Work shift: Daytime
• Strong problem-solving skills with an emphasis on product development and drawing insights from large data sets.
• Experience in building ML pipelines with Apache Spark and Python.
• Proficiency in implementing the end-to-end data science lifecycle.
• Experience in model fine-tuning and advanced grid search techniques.
• Experience working with and creating data architectures.
• Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.) and their real-world advantages/drawbacks.
• Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and proper usage, etc.) and experience with applications.
• Excellent written and verbal communication skills for coordinating across teams.
• A drive to learn and master new technologies and techniques.
• Assess the effectiveness and accuracy of new data sources and data-gathering techniques.
• Develop custom data models and algorithms to apply to data sets.
• Use predictive modeling to increase and optimize customer experiences, revenue generation, ad targeting, and other business outcomes.
• Develop the company A/B testing framework and test model quality (a minimal significance-test sketch follows this list).
• Coordinate with different functional teams to implement models and monitor outcomes.
• Develop processes and tools to monitor and analyze model performance and data accuracy.
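A minimal sketch of the kind of significance check an A/B testing framework like the one mentioned above might run; the conversion data here is synthetic and purely illustrative:

```python
# Minimal A/B significance-test sketch (illustrative only; the conversion
# data below is synthetic, not from any real experiment).
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
control = rng.binomial(1, 0.10, size=5000)  # hypothetical control conversions
variant = rng.binomial(1, 0.11, size=5000)  # hypothetical variant conversions

# Welch's t-test on the conversion indicators (a simple proxy for a
# two-proportion z-test at these sample sizes).
t_stat, p_value = stats.ttest_ind(variant, control, equal_var=False)
lift = variant.mean() - control.mean()
print(f"lift={lift:.4f}, p-value={p_value:.4f}")
```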
Key skills:
● Strong knowledge of data science pipelines with Python
● Object-oriented programming
● A/B testing frameworks and model fine-tuning (see the grid-search sketch after this list)
● Proficiency with the scikit-learn, NumPy, and pandas packages in Python
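A minimal grid-search sketch with scikit-learn illustrating the model fine-tuning skills listed above; the synthetic dataset, parameter grid, and scoring metric are assumptions for demonstration only:

```python
# Minimal grid-search sketch with scikit-learn (synthetic data and a small,
# hypothetical parameter grid chosen purely for illustration).
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

param_grid = {
    "n_estimators": [100, 200],
    "learning_rate": [0.05, 0.1],
    "max_depth": [2, 3],
}
search = GridSearchCV(
    GradientBoostingClassifier(random_state=0),
    param_grid,
    scoring="roc_auc",
    cv=5,
    n_jobs=-1,
)
search.fit(X_train, y_train)
print("best params:", search.best_params_)
print("held-out AUC:", search.score(X_test, y_test))
```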
Nice to have:
● Ability to work with containerized solutions: Docker/Compose/Swarm/Kubernetes
● Unit testing, Test-driven development practice
● DevOps, Continuous integration/ continuous deployment experience
● Agile development environment experience, familiarity with SCRUM
● Deep learning knowledge
To generate leads
To create dealer and distributor networks
To handle channel sales
To visit retail outlets
To collect orders
To generate sales and revenue
To achieve sales targets

