
Performance Marketing Specialist
Experience: 1–2 Years
Qualification: Minimum Undergraduate Degree (UG)
Certification: Good to have (Performance Marketing / Digital Marketing / Analytics)
Role Overview:
We are looking for a data-driven Performance Marketing Specialist to scale user acquisition and drive business growth for Tamasha’s consumer-facing apps.
Products:
• Openly – A conversation-first social app focused on meaningful interactions.
• Playroom – Voicechat – A real-time voice chat platform for live community engagement.
• FriendChat – A chatroom-based social app for discovering and connecting with new people.
Key Responsibilities:
• Plan, execute, and optimize paid acquisition campaigns across Meta and Google Ads.
• Drive high-quality app installs with a focus on CPI, CPA, ROAS, LTV, and retention (the core formulas are sketched below).
• Own daily budget allocation, bid optimization, targeting, and scaling decisions.
• Perform deep analysis at the campaign, ad set, ad, and creative levels.
• Build daily, weekly, and monthly performance reports.
• Analyze funnel metrics and identify drop-offs with actionable insights.
• Work closely with product and analytics teams to ensure attribution accuracy.
• Share insights and recommendations clearly with stakeholders.
Required Skills:
• Strong hands-on experience with Meta and Google Ads.
• Data-driven mindset with strong analytical skills.
• Experience in mobile app performance marketing.
• Excellent communication and stakeholder management skills.
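The role leans heavily on acquisition metrics, so as a quick reference, here is a minimal Python sketch of how CPI, CPA, and ROAS are typically computed; all campaign figures below are hypothetical, not Tamasha data:

```python
# Standard performance-marketing metric formulas; all figures are hypothetical.

def cpi(spend: float, installs: int) -> float:
    """Cost per install."""
    return spend / installs

def cpa(spend: float, actions: int) -> float:
    """Cost per acquisition (e.g., purchases or signups)."""
    return spend / actions

def roas(revenue: float, spend: float) -> float:
    """Return on ad spend."""
    return revenue / spend

# Hypothetical Meta campaign: $5,000 spend, 2,500 installs, 300 purchases, $9,000 revenue.
spend, installs, purchases, revenue = 5000.0, 2500, 300, 9000.0
print(f"CPI:  {cpi(spend, installs):.2f}")   # 2.00
print(f"CPA:  {cpa(spend, purchases):.2f}")  # 16.67
print(f"ROAS: {roas(revenue, spend):.2f}x")  # 1.80x
```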

About Tamashalive
Gamepe Technologies creates innovative digital products that connect communities through live interactive contests, audio experiences, and meaningful connections.
Job Title : Senior SAP PPDS Consultant
Experience : 6+ Years
Location : Open to USI locations (Hyderabad / Bangalore / Mumbai / Pune / Chennai / Gurgaon)
Job Type : Full-Time
Start Date : Immediate Joiners Preferred
Job Description :
We are urgently seeking a Senior SAP PPDS (Production Planning and Detailed Scheduling) Consultant with strong implementation experience.
The ideal candidate will be responsible for leading and supporting end-to-end project delivery for SAP PPDS, contributing to solution design, configuration, testing, and deployment in both Greenfield and Brownfield environments.
Mandatory Skills : SAP PPDS, CIF Integration, Heuristics, Pegging Strategies, Production Scheduling, S/4 HANA or ECC, Greenfield/Brownfield Implementation.
Key Responsibilities :
- Lead the implementation of SAP PPDS modules, including system configuration and integration with SAP ECC/S/4HANA.
- Collaborate with stakeholders to gather requirements and define functional specifications.
- Design, configure, and test SAP PPDS solutions to meet business needs.
- Provide support for system upgrades, patches, and enhancements.
- Participate in workshops, training sessions, and knowledge transfers.
- Troubleshoot and resolve issues during implementation and post-go-live.
- Ensure documentation of functional specifications, configuration, and user manuals.
Required Skills :
- 6+ years of SAP PPDS experience.
- At least 1-2 Greenfield or Brownfield implementation projects.
- Strong understanding of supply chain planning and production scheduling.
- Hands-on experience in CIF integration, heuristics, optimization, and pegging strategies.
- Excellent communication and client interaction skills.
Preferred Qualifications :
- Experience in S/4 HANA environment.
- SAP PPDS Certification is a plus.
- Experience working in large-scale global projects.
Role & Responsibilities
About the Role:
We are seeking a highly skilled Senior Data Engineer with 5-7 years of experience to join our dynamic team. The ideal candidate will have a strong background in data engineering, with expertise in data warehouse architecture, data modeling, ETL processes, and building both batch and streaming pipelines. The candidate should also possess advanced proficiency in Spark, Databricks, Kafka, Python, SQL, and Change Data Capture (CDC) methodologies.
Key responsibilities:
Design, develop, and maintain robust data warehouse solutions to support the organization's analytical and reporting needs.
Implement efficient data modeling techniques to optimize performance and scalability of data systems.
Build and manage data lakehouse infrastructure, ensuring reliability, availability, and security of data assets.
Develop and maintain ETL pipelines to ingest, transform, and load data from various sources into the data warehouse and data lakehouse.
Utilize Spark and Databricks to process large-scale datasets efficiently and in real time.
Implement Kafka for building real-time streaming pipelines and ensure data consistency and reliability (a minimal pipeline sketch follows this list).
Design and develop batch pipelines for scheduled data processing tasks.
Collaborate with cross-functional teams to gather requirements, understand data needs, and deliver effective data solutions.
Perform data analysis and troubleshooting to identify and resolve data quality issues and performance bottlenecks.
Stay updated with the latest technologies and industry trends in data engineering and contribute to continuous improvement initiatives.
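To make the streaming responsibilities concrete, here is a minimal sketch of a Kafka-to-Delta ingestion job of the kind described above, assuming a Databricks/PySpark environment; the broker address, topic name, schema, and storage paths are hypothetical placeholders:

```python
# Minimal Kafka -> Delta streaming ingestion sketch (PySpark Structured Streaming).
# Broker, topic, schema, and storage paths below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("events-ingest").getOrCreate()

event_schema = StructType([
    StructField("user_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_time", TimestampType()),
])

# Read the raw event stream from Kafka.
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "app_events")
       .load())

# Kafka delivers bytes; parse the JSON payload into typed columns.
events = (raw
          .select(from_json(col("value").cast("string"), event_schema).alias("e"))
          .select("e.*"))

# Append into a bronze Delta table; the checkpoint enables fault-tolerant recovery.
(events.writeStream
 .format("delta")
 .option("checkpointLocation", "/lake/checkpoints/app_events")
 .outputMode("append")
 .start("/lake/bronze/app_events"))
```

A batch variant of the same job only swaps `readStream`/`writeStream` for `read`/`write`, which is one reason Structured Streaming is a common choice where both batch and streaming pipelines are required.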
- MuleSoft certification is mandatory.
- B.Tech/BE/MCA only, with 6+ years of MuleSoft experience.
- Ready to join within 0-15 days or currently serving notice period.
Required Skills:
SOA
REST
Web Services
Mule ESB
Anypoint Studio
API Management
CloudHub
Integration Patterns
Good communication skills
Mass Tort Specialist
We're looking for an experienced professional to handle complex mass tort litigation, manage multiple cases, and communicate effectively with clients. Your expertise will be crucial in navigating intricate legal issues and achieving successful outcomes. If you have a strong background in mass tort law, excellent organizational skills, and a passion for justice, we want to hear from you. This is a fantastic opportunity to make a significant impact and grow your career in a dynamic and supportive environment.
As the Head of Engineering, you will build and lead the entire engineering team to solve problems and deliver the complete learning platform. This involves building the learning management system, the content management system, and the online live class platform, including the backend platform and the frontend journeys for customers.
Roles and responsibilities:
• Building the product architecture for performance and scalability of the end-to-end product
• Recruiting, mentoring, and training junior developers
• Working in close coordination with the founder to implement the company's business objectives
• End-to-end delivery of new features
• Designing, developing, and owning components of a highly scalable education platform
• Constantly striving to improve the software development process and team productivity
• Leading module development independently
• Working with the Product and Sales teams to align engineering with business goals
• Ensuring design processes that deliver better user experience, higher engagement, and root-cause elimination of technical issues
• You must be excellent at Java, MongoDB/MERN stack, and React Native to build the organization's web and mobile applications
Required background, knowledge, skills, and abilities:
• 5-8 years of experience building products end-to-end, preferably with at least 3 years building products from scratch through the 0-to-1 journey
• Tech stack: Node.js, React Native, React.js, AWS, MongoDB
• Ability to code and build the product on your own, with assistance from junior developers
• Strong knowledge of performance optimization
• Strong object-oriented programming concepts, data structures, and algorithms
• Experience developing scalable, fault-tolerant, distributed backend services
• Experience with prevalent design patterns and advanced system design
• Good experience with databases and schema design; knowledge of NoSQL databases
• Strong problem-solving and communication skills
• A frugal approach to problem solving and the product-building process
• B.E/B.Tech/M.Tech in Computer Science or a related field
Who can apply:
• You are available for full-time, in-office work (Bangalore)
• You have relevant skills and interests
Other rewards:
• Opportunity to step up and become a leader, take full ownership of your work, and create impactful global products
• A chance to work alongside the leadership team from Harvard and IIT
• Competitive, market-standard compensation
Hands-on experience in React.js and a good understanding of its core concepts
Hands-on experience with state management libraries like Redux
Hands-on experience in HTML5, CSS3, and JavaScript
Experience integrating with RESTful web services
Proficiency with ES6/7/8 syntax and concepts
Proficient knowledge of cross-browser compatibility issues
Knowledge of modern authorization mechanisms, such as JSON Web Tokens
Experience with common front-end development tools such as Babel, Webpack, and NPM

- General Accounting
- Accounts Payable / Accounts Receivable
- Debit/credit, reconciliation
- Bank Receivable and Bank Payable
- TDS, tax rules
- Balance sheet
- Bank and other reconciliations
Role and Responsibilities
- Execute data mining projects, training and deploying models over a typical duration of 2-12 months.
- The ideal candidate should be able to innovate, analyze the customer requirement, develop a solution within the time box of the project plan, and execute and deploy the solution.
- Integrate the data mining projects as embedded data mining applications in the FogHorn platform (on Docker or Android).
Core Qualifications
Candidates must meet ALL of the following qualifications:
- Have analyzed, trained and deployed at least three data mining models in the past. If the candidate did not directly deploy their own models, they will have worked with others who have put their models into production. The models should have been validated as robust over at least an initial time period.
- Three years of industry work experience, developing data mining models which were deployed and used.
- Programming experience in Python is core, using data mining libraries such as scikit-learn; other relevant Python libraries include NumPy, SciPy, and Pandas (a minimal workflow sketch follows this list).
- Experience with at least three data mining algorithms across prediction (statistical regression, neural nets, deep learning, decision trees, SVM, ensembles), clustering (k-means, DBSCAN, or others), or Bayesian networks.
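As a point of reference for the qualifications above, here is a minimal sketch of the train/validate/persist workflow with scikit-learn; the synthetic dataset, model choice, and file name are illustrative assumptions, not a description of FogHorn's actual pipeline:

```python
# Minimal train/validate/persist sketch with scikit-learn.
# The synthetic dataset, model choice, and file name are illustrative only.
import joblib
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic tabular data standing in for real sensor or customer features.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Validate on held-out data before the model is trusted for deployment.
print(f"holdout accuracy: {accuracy_score(y_test, model.predict(X_test)):.3f}")

# Persist the model so a scoring service (e.g., in a Docker container) can load it.
joblib.dump(model, "model.joblib")
```

In production, the same pattern extends to periodic retraining on newer data and monitoring of the deployed model, which is the full-life-cycle ownership mentioned in the bonus qualifications below.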
Bonus Qualifications
Any of the following extra qualifications will make a candidate more competitive:
- Soft Skills
- Sets expectations, develops project plans, and meets them.
- Experience adapting technical dialogue to the right level for the audience (i.e. executives) or specific jargon for a given vertical market and job function.
- Technical skills
- Commonly, candidates have a MS or Ph.D. in Computer Science, Math, Statistics or an engineering technical discipline. BS candidates with experience are considered.
- Have managed past models in production over their full life cycle until model replacement is needed. Have developed automated model refreshing on newer data. Have developed frameworks for model automation as a prototype for product.
- Training or experience in Deep Learning, such as TensorFlow, Keras, convolutional neural networks (CNN) or Long Short Term Memory (LSTM) neural network architectures. If you don’t have deep learning experience, we will train you on the job.
- Shrinking deep learning models, optimizing to speed up execution time of scoring or inference.
- OpenCV or other image processing tools or libraries
- Cloud computing: Google Cloud, Amazon AWS or Microsoft Azure. We have integration with Google Cloud and are working on other integrations.
- Experience with tree ensembles such as XGBoost or Random Forests is helpful.
- Complex Event Processing (CEP) or other streaming data as a data source for data mining analysis
- Time series algorithms from ARIMA to LSTM to Digital Signal Processing (DSP).
- Bayesian Networks (BN), a.k.a. Bayesian Belief Networks (BBN) or Graphical Belief Networks (GBN)
- Experience with PMML is of interest (see www.DMG.org).
- Vertical experience in Industrial Internet of Things (IoT) applications:
- Energy: Oil and Gas, Wind Turbines
- Manufacturing: Motors, chemical processes, tools, automotive
- Smart Cities: Elevators, cameras on population or cars, power grid
- Transportation: Cars, truck fleets, trains
About FogHorn Systems
FogHorn is a leading developer of “edge intelligence” software for industrial and commercial IoT application solutions. FogHorn’s Lightning software platform brings the power of advanced analytics and machine learning to the on-premise edge environment enabling a new class of applications for advanced monitoring and diagnostics, machine performance optimization, proactive maintenance and operational intelligence use cases. FogHorn’s technology is ideally suited for OEMs, systems integrators and end customers in manufacturing, power and water, oil and gas, renewable energy, mining, transportation, healthcare, retail, as well as Smart Grid, Smart City, Smart Building and connected vehicle applications.
Press: https://www.foghorn.io/press-room/
Awards: https://www.foghorn.io/awards-and-recognition/
- 2019 Edge Computing Company of the Year – Compass Intelligence
- 2019 Internet of Things 50: 10 Coolest Industrial IoT Companies – CRN
- 2018 IoT Platforms Leadership Award & Edge Computing Excellence – IoT Evolution World Magazine
- 2018 10 Hot IoT Startups to Watch – Network World. (Gartner estimated 20 billion connected things in use worldwide by 2020)
- 2018 Winner in Artificial Intelligence and Machine Learning – Globe Awards
- 2018 Ten Edge Computing Vendors to Watch – ZDNet & 451 Research
- 2018 The 10 Most Innovative AI Solution Providers – Insights Success
- 2018 The AI 100 – CB Insights
- 2017 Cool Vendor in IoT Edge Computing – Gartner
- 2017 20 Most Promising AI Service Providers – CIO Review
Our Series A round raised $15 million; our Series B round raised $30 million in October 2017. Investors include Saudi Aramco Energy Ventures, Intel Capital, GE, Dell, Bosch, Honeywell, and The Hive.
About the Data Science Solutions team
In 2018, our Data Science Solutions team grew from 4 to 9, and we are now growing again from 11. We work on revenue-generating projects for clients, such as predictive maintenance, time-to-failure, and manufacturing defects. About half of our projects have been related to vision recognition or deep learning. We are not only working on consulting projects but also developing vertical solution applications, with embedded data mining, that run on our Lightning platform.
Our data scientists like our team because:
- We care about “best practices”
- We have a direct impact on the company’s revenue
- We give and receive mentoring as part of the collaborative process
- Questioning and challenging the status quo with data is safe
- Intellectual curiosity is balanced with humility
- We present papers and projects in our “Thought Leadership” meeting series to support continuous learning