About the role
We’re looking for a passionate Market Research Associate with an analytical mindset and a drive to stay on top of current market trends.
As a Market Research Associate, you will be responsible for providing actionable insights by conducting detailed market research and delivering in-depth analysis of current market trends, competitive positioning, and market share across our line of products: Scalefusion, NuovoPay, and NovoTeam.
What will you do?
- Develop a good understanding of the SaaS industry and perform in-depth analysis to evaluate the current market trends.
- Demonstrate the ability to track and analyze global market developments, the competitive landscape, the impact of economic trends on the industry, business scenarios and growth prospects.
- Provide analytical interpretation of the data to conceptualize and build perspectives based on industry trends and their impact.
- Identify and understand change-causing trends that impact our market, as well as industries related to our products.
- Perform valid and reliable market research on competitors’ products & services and turn the findings into actionable insights for product and marketing managers.
- Help our sales & marketing teams generate effective strategies based on secondary market research.
- Collect data on consumers, competitors and the marketplace, and consolidate the information into actionable items, reports and presentations.
- Understand target market characteristics and map its needs to our offerings.
- Research, curate and build a database (accounts & contacts) using lead generation tools like LinkedIn Sales Navigator, ZoomInfo, Lusha, etc. to fuel the sales pipeline.
- Identify prospective industry events, trade shows, networking events and opportunities.

About Promobi Technologies
ProMobi Technologies provides a leading Mobile Device Management solution under the brand Scalefusion. The solution allows organizations to manage Android and iOS devices from the cloud. It offers a modern mobile device management (MDM), mobile application management (MAM) and mobile content management (MCM) experience for corporate-owned devices. Organizations ranging from startups to the Fortune 500 trust Scalefusion for their device management.
Scalefusion (formerly known as Mobilock Pro): our flagship product
Scalefusion MDM allows organizations to secure & manage endpoints including smartphones, tablets, laptops, rugged devices, mPOS, and digital signage, along with apps and content. It supports the management of Android, iOS, macOS and Windows 10 devices and ensures streamlined device management operations with InterOps. Fusion of Endpoints at Scale.


Job Description:
We are looking for a highly creative and talented Content Writer with expertise in crafting compelling advertisements and SEO-driven blog content. This role requires someone who is not only a wordsmith but also a creative thinker, capable of translating complex ideas into clear, persuasive, and engaging copy that captures the attention of the audience. The ideal candidate will be proficient in producing ads that resonate with customers and blogs that rank well on search engines.
Key Responsibilities:
- Ad Copywriting: Create persuasive and attention-grabbing content for various advertising platforms (Google Ads, Facebook Ads, Instagram, etc.) that drives conversions and engagement. (Must have prior experience in writing effective ad copy.)
- SEO Blog Writing: Write high-quality, SEO-optimized blog posts that are informative, engaging, and rank well on search engines.
- Creative Concept Development: Work closely with the marketing team to brainstorm and develop creative content ideas and campaigns that align with brand messaging and goals.
- Content Strategy: Assist in the development of content strategies to ensure all written material is aligned with the overall marketing objectives.
- Content Editing & Proofreading: Review and edit content for clarity, consistency, and accuracy, ensuring it adheres to the brand voice and guidelines.
- Trend Monitoring: Stay up-to-date with industry trends, marketing strategies, and the latest SEO best practices to continually improve content quality and engagement.
Connect with me at: https://www.linkedin.com/in/babitagupta7
- Big Data developer with 8+ years of professional IT experience and expertise in Hadoop ecosystem components spanning ingestion, data modeling, querying, processing, storage, analysis, data integration and implementation of enterprise-level Big Data systems.
- A skilled developer with strong problem-solving, debugging and analytical capabilities, who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components such as Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, ZooKeeper, YARN, Flume, Pig, NiFi, Scala and Oozie.
- Hands-on experience in creating real-time data streaming solutions using Apache Spark Core, Spark SQL & DataFrames, Kafka, Spark Streaming and Apache Storm.
- Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, including the NameNode, DataNode, ResourceManager, NodeManager and JobHistory Server.
- Worked on both the Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in installation, configuration and management of Big Data tooling and the underlying infrastructure of a Hadoop cluster.
- Hands on experience in coding MapReduce/Yarn Programs using Java, Scala and Python for analyzing Big Data.
- Exposure to Cloudera development environment and management using Cloudera Manager.
- Extensively worked on Spark using Scala on clusters for computational analytics; installed it on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL/Oracle.
- Implemented Spark using Python, utilizing DataFrames and the Spark SQL API for faster data processing; handled importing data from different data sources into HDFS using Sqoop, performing transformations using Hive and MapReduce, and then loading the data into HDFS.
- Used Spark Data Frames API over Cloudera platform to perform analytics on Hive data.
- Hands-on experience with Spark MLlib, used for predictive intelligence, customer segmentation and smooth maintenance in Spark Streaming.
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Working on creating data pipelines for ingestion and aggregation events and loading consumer response data into Hive external tables in an HDFS location to serve as a feed for Tableau dashboards.
- Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
- In-depth Understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
- Hands on expertise in real time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
- Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
- Experience with the Microsoft cloud and with setting up clusters on Amazon EC2 & S3, including automating cluster setup and scaling in the AWS cloud.
- Extensively worked on Spark using Python on clusters for computational analytics; installed it on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL.
- Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
- Knowledge in installation, configuration, supporting and managing Hadoop Clusters using Apache, Cloudera (CDH3, CDH4) distributions and on Amazon web services (AWS).
- Experienced in writing ad hoc queries using Cloudera Impala, including Impala analytical functions.
- Experience in creating DataFrames using PySpark and performing operations on them using Python (see the illustrative sketch after this list).
- In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS and MapReduce Programming Paradigm, High Availability and YARN architecture.
- Established connections to multiple Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provided access for pulling the information needed for analysis.
- Generated various kinds of knowledge reports using Power BI based on business specifications.
- Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
- Well experienced in projects using JIRA and the Maven and Jenkins build and test tools.
- Experienced in designing, building, deploying and utilizing almost all of the AWS stack (including EC2 and S3), focusing on high availability, fault tolerance and auto-scaling.
- Good experience with use-case development, with Software methodologies like Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
- Good working experience in importing data using Sqoop and SFTP from various sources like RDBMS, Teradata, Mainframes, Oracle and Netezza into HDFS, and performing transformations on it using Hive, Pig and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases including HBase, Cassandra and MongoDB, and their integration with Hadoop clusters.
- Hands-on experience in Hadoop Big Data technology, working with MapReduce, Pig and Hive as analysis tools, and Sqoop and Flume as data import/export tools.
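To ground the Spark DataFrame and Spark SQL experience listed above, here is a minimal, illustrative PySpark sketch of the kind of pipeline described: consumer response data loaded into a DataFrame, aggregated with Spark SQL, and written to an HDFS location that a Hive external table feeding Tableau could point at. The paths, column names and application name are hypothetical placeholders, not details from the profile above.

```python
# Minimal PySpark sketch, illustrative only. Paths, columns and table layout
# are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("consumer-response-aggregation")  # hypothetical app name
    .enableHiveSupport()
    .getOrCreate()
)

# Load raw events from HDFS into a DataFrame
events = spark.read.option("header", "true").csv("hdfs:///data/raw/consumer_responses/")

# DataFrame API: basic typing/cleansing
events = events.withColumn("response_score", F.col("response_score").cast("double"))

# Spark SQL over a temporary view
events.createOrReplaceTempView("consumer_responses")
daily = spark.sql("""
    SELECT event_date, product, AVG(response_score) AS avg_score, COUNT(*) AS responses
    FROM consumer_responses
    GROUP BY event_date, product
""")

# Write results to a location a Hive external table could be defined over
daily.write.mode("overwrite").parquet("hdfs:///data/curated/consumer_response_daily/")

spark.stop()
```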


Interested candidates can share their resume with the following details:
- Years of experience
- Current CTC
- Expected CTC
- Notice period
- Are you open to relocation?
The Lead Quality Assurance (QA) Engineer will be responsible for developing and maintaining a best-in-class QA process to ensure the highest quality of software products.
This individual will work with the development team to define, design, and maintain comprehensive test plans and test cases, develop automation frameworks, and execute tests.
Essential Duties and Responsibilities
• Lead the development and implementation of a comprehensive QA process to ensure the highest quality of software products.
• Design and develop automated test plans, test cases, and test scripts (see the illustrative sketch after this list).
• Execute tests and document results.
• Develop and maintain automation frameworks.
• Identify, document, and report software defects.
• Work with the development team to define, design, and maintain comprehensive test plans and test cases.
• Participate in design reviews and provide feedback on product design specifications.
• Monitor product quality on an ongoing basis and provide feedback to the development team on areas of improvement.
• Stay current with the latest technologies and trends in software development.
• Provide technical guidance and leadership to other QA engineers.
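As an illustration of the kind of automated test scripts and framework pieces this role calls for, here is a minimal pytest sketch of an API-level test case. The base URL, endpoints and expected fields are hypothetical placeholders; the actual product under test and framework choices are not specified in this posting.

```python
# Minimal pytest sketch of automated API tests, illustrative only.
import pytest
import requests

BASE_URL = "https://staging.example.com/api"  # hypothetical test environment


@pytest.fixture
def session():
    # Shared HTTP session for all tests in this module
    s = requests.Session()
    s.headers.update({"Accept": "application/json"})
    yield s
    s.close()


def test_health_endpoint_returns_ok(session):
    resp = session.get(f"{BASE_URL}/health", timeout=10)
    assert resp.status_code == 200
    assert resp.json().get("status") == "ok"


@pytest.mark.parametrize("user_id", [1, 42, 9999])
def test_user_lookup_handles_known_and_unknown_ids(session, user_id):
    resp = session.get(f"{BASE_URL}/users/{user_id}", timeout=10)
    # Known users return 200 with an id; unknown ones should return 404, not 500.
    assert resp.status_code in (200, 404)
    if resp.status_code == 200:
        assert resp.json()["id"] == user_id
```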
• 3-6 years of experience in designing and developing secure, scalable and highly available orchestration solutions using BPM, preferably Camunda, including process and rules modelling.
• Experience in PEGA and/or TIBCO BPM systems is a plus.
• Worked with very complex workflows, asynchronous tasks, user tasks, event listeners, and Business Central deployments and APIs.
• Ability to configure BPM workflows per client needs (see the illustrative sketch after this list). Experience at client locations is preferred.
• Strong knowledge of BPMN 2.0, DMN and CMMN. Hands-on workflow configuration is preferred.
• Good knowledge of CI/CD, DevOps and Scrum practices.
• Ability to adapt and work in an agile, fast-paced environment.
• Collaborates with multiple teams of developers/BAs/designers to implement project specifications, providing workflow support and technical guidance to less experienced team members.
• Very good analytical and problem-solving ability, strong verbal and written communication skills, and expertise in client demos.
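For a concrete sense of driving a BPM workflow programmatically, here is a minimal sketch that starts a process instance over Camunda's REST API, assuming a Camunda 7 engine exposing its standard REST endpoint under /engine-rest. The base URL, process key and variables are hypothetical placeholders, not details from this posting.

```python
# Minimal sketch: start a Camunda 7 process instance via the engine's REST API.
# Assumes the standard /engine-rest endpoint; names below are hypothetical.
import requests

CAMUNDA_BASE = "http://localhost:8080/engine-rest"  # hypothetical engine URL


def start_process(process_key: str, business_key: str, variables: dict) -> dict:
    """Start a process instance by its BPMN process definition key."""
    payload = {
        "businessKey": business_key,
        # All variables are passed as String for simplicity in this sketch.
        "variables": {
            name: {"value": value, "type": "String"}
            for name, value in variables.items()
        },
    }
    resp = requests.post(
        f"{CAMUNDA_BASE}/process-definition/key/{process_key}/start",
        json=payload,
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    instance = start_process("loan_approval", "APP-1234", {"applicant": "Jane Doe"})
    print(instance["id"])
```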
Business Analyst Job Description
Business Analyst, NextGen Orchestration Platform
What you will do:
You will work on fulfilling Rakuten’s vision of the world’s first fully virtualized telecom network, with some of the leading vendor partners in this domain.
You will be responsible for executing the role of a Business Analyst in the Telecommunications network management & central automation & orchestration team, which carries out in-house software development & product engineering.
What your experience should look like:
You have worked with some of the major players in the domain and are hands-on with OSS systems for FCAPS management (Fault, Configuration, Accounting, Performance and Security).
You understand SDN virtualization technologies and are familiar with ETSI NFV MANO and ONAP standards.
Broad understanding of OpenStack & Docker / Kubernetes & microservices architecture.
Broad understanding of software development in an Agile framework.
What we also look for:
Solid team player & individual contributor.
Thinks from a product and end-user perspective; able to understand the customer journey and propose solutions.
Ability to apply design patterns to build microservice-based products.
Ability to work in a highly competitive, matrix organization to build and maintain successful cross-domain relationships.
Certifications:
Preferred
Agile/Scrum Master certifications
Project Management certifications.
Requirements
- Previous work experience in Web3 Product Design
- Or experience in software design
Responsibilities
Hiring Process
- Application Review
- Introductory Interview
- Technical Interview
- Offer Letter

Our client focuses on providing solutions in data, analytics, decisioning and automation. They focus on the lending lifecycle of financial institutions, and their products are designed around systemic fraud prevention, risk management, compliance, etc.
Our client is a one stop solution provider, catering to the authentication, verification and diligence needs of various industries including but not limited to, banking, insurance, payments etc.
Headquartered in Mumbai, our client was founded in 2015 by a team of three veteran entrepreneurs, two of whom are chartered accountants and one is a graduate from IIT, Kharagpur. They have been funded by tier 1 investors and have raised $1.1M in funding.
What you will do:
- Developing a deep understanding of our vast data sources on the web and knowing exactly how, when, and which data to scrape, parse and store
- Working closely with Database Administrators to store data in SQL and NoSQL databases
- Developing frameworks for automating and maintaining constant flow of data from multiple sources
- Working independently with little supervision to research and test innovative solutions
Desired Candidate Profile
What you need to have:
- Bachelor's/Master's degree in Computer Science/Computer Engineering/Information Technology
- 1 - 5 years of relevant experience
- Strong coding experience in Python (knowledge of Java, JavaScript is a plus)
- Experience with SQL and NoSQL databases
- Experience with multi-processing, multi-threading, and AWS/Azure
- Strong knowledge of scraping frameworks and libraries such as Python's Requests and Beautiful Soup, Web Harvest and others (see the illustrative sketch after this list)
- In-depth knowledge of algorithms and data structures; previous experience with web crawling is a must
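As a concrete reference for the scraping stack mentioned above, here is a minimal sketch using Requests and Beautiful Soup. The target URL and CSS selectors are hypothetical placeholders; a production pipeline would also add politeness controls (robots.txt, rate limiting), retries and structured storage in SQL/NoSQL.

```python
# Minimal scraping sketch with requests + Beautiful Soup, illustrative only.
import requests
from bs4 import BeautifulSoup


def scrape_listing(url: str) -> list[dict]:
    resp = requests.get(url, headers={"User-Agent": "research-bot/0.1"}, timeout=15)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")

    records = []
    for card in soup.select("div.listing-card"):  # hypothetical selector
        name = card.select_one("h2.name")
        price = card.select_one("span.price")
        records.append({
            "name": name.get_text(strip=True) if name else None,
            "price": price.get_text(strip=True) if price else None,
        })
    return records


if __name__ == "__main__":
    for row in scrape_listing("https://example.com/listings"):  # hypothetical URL
        print(row)
```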


1. Good command of Python with either Django or Flask
2. Has worked on large-scale systems
3. Experience in building REST APIs (see the illustrative sketch after this list)
4. Proficiency with databases such as MySQL, Oracle and MongoDB
5. Knowledge of Kubernetes, Docker and deployment
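As a concrete reference for the REST API requirement above, here is a minimal Flask sketch exposing one resource with list, create and fetch endpoints. The routes, fields and in-memory store are hypothetical placeholders standing in for a real database such as MySQL, Oracle or MongoDB.

```python
# Minimal Flask REST API sketch, illustrative only.
from flask import Flask, jsonify, request, abort

app = Flask(__name__)

ITEMS = {}   # in-memory store standing in for a real database
NEXT_ID = 1


@app.route("/api/items", methods=["GET"])
def list_items():
    return jsonify(list(ITEMS.values()))


@app.route("/api/items", methods=["POST"])
def create_item():
    global NEXT_ID
    payload = request.get_json(silent=True)
    if not payload or "name" not in payload:
        abort(400, description="'name' is required")
    item = {"id": NEXT_ID, "name": payload["name"]}
    ITEMS[NEXT_ID] = item
    NEXT_ID += 1
    return jsonify(item), 201


@app.route("/api/items/<int:item_id>", methods=["GET"])
def get_item(item_id):
    item = ITEMS.get(item_id)
    if item is None:
        abort(404)
    return jsonify(item)


if __name__ == "__main__":
    app.run(debug=True)
```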
KEY TASKS AND DUTIES:
1. Prospecting for new clients/ lead generation
2. Direct communication with customers via e-mail and telephone to promote Bitplane and its product range
3. Execute marketing plans/marketing campaigns (e.g. targeted e-shots, mail shots)
4. Follow up and qualify contacts from web registrations, promotions, events and other marketing and sales activities
5. Preparation and presentation of results from campaigns
6. Organizing visits for Bitplane Regional Sales Engineers when required
7. Recording of all sales activities in the CRM
8. Market research feedback
9. Any other duties as may be reasonably requested from time to time by your Line Manager





