![Oneistox India Pvt Ltd's logo](https://cdnv2.cutshort.io/company-static/63c00c440e7f110027058761/user_uploaded_data/logos/company_logo_JPkuDBoL.png)
Job Description
Function: Product → Product Analytics
Responsibilities:
- Assist product managers in formulating the company's product strategy using structured data and the insights derived from it
- Conduct research, create business cases and translate them into meaningful problems to solve
- Measure the impact of function-related experiments, analysing results and helping course-correct
- Recommend product improvements based on analytical findings; define new metrics, techniques, and strategies to improve performance
- Continuously monitor and analyse the identified metrics; publish insights and any anomalies along with hypotheses
- Translate business requirements and user requests into effective report and dashboard designs under challenging deadlines
- Assist with performance tuning of dashboards, background data queries as needed
Key Skills Required:
- Bachelor’s degree along with 2+ years' experience in product analytics building data sets, reports, and dashboards
- Strong analytics skills and experience in Metabase, Google Analytics, Power BI, or other analytics software
- Proficiency with SQL
- Agility: the ability to anticipate needs, be responsive, and adapt to change
- Strong interpersonal and relationship skills, ability to influence decisions and gain consensus
- Excellent time and project management skills, ability to prioritise the most important projects to create business impact
Perks at Oneistox:
- Challenging work, High Product Ownership, and Steep Learning Curve are guaranteed!
- You get to be part of a young, energetic team.
- Experience the growth of a company from 5X to 500X.
- Industry-standard compensation and ESOPs.
Similar jobs
RESPONSIBILITIES:
Requirement understanding and elicitation: analyse data/workflows, contribute to product
projects and proofs of concept (POC).
Contribute to preparing design documents and effort estimations.
Develop AI/ML models using best-in-class techniques.
Building, testing, and deploying AI/ML solutions.
Work with Business Analysts and Product Managers to assist with defining functional user stories.
Ensure deliverables across teams are of high quality and clearly documented.
Recommend best ML practices/Industry standards for any ML use case.
Proactively take up R&D and recommend solution options for any ML use case.
REQUIREMENTS:
Required Skills
Overall experience of 4 to 7 years working on AI/ML framework development
Good programming knowledge of Python is a must.
Good knowledge of R and SAS is desired.
Good hands-on working knowledge of SQL, data modelling, and CRISP-DM.
Proficiency with univariate/multivariate statistics, algorithm design, and predictive AI/ML modelling.
Strong knowledge of machine learning algorithms: linear regression, logistic regression, KNN,
Random Forest, Support Vector Machines, and Natural Language Processing.
Experience with NLP and deep neural networks using synthetic and artificial data.
Involved in different phases of the SDLC, with good working exposure to different SDLC methodologies such as Agile.
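As a concrete illustration of one of the algorithms named in the requirements above, here is a minimal from-scratch KNN classifier. This is only an illustrative sketch (the toy data and function name are invented for this example); in practice a library such as scikit-learn would be used.

```python
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest
    training points (Euclidean distance). `train` is a list of
    (feature_vector, label) pairs -- a toy KNN illustration."""
    dists = sorted(
        (math.dist(x, query), label) for x, label in train
    )
    top_k = [label for _, label in dists[:k]]
    return Counter(top_k).most_common(1)[0][0]

# Toy data: two clusters on a 2-D plane
train = [((0, 0), "a"), ((0, 1), "a"), ((1, 0), "a"),
         ((5, 5), "b"), ((5, 6), "b"), ((6, 5), "b")]
print(knn_predict(train, (0.5, 0.5)))  # -> a
print(knn_predict(train, (5.5, 5.5)))  # -> b
```

The same interview-style exercise generalises to the other listed models: each replaces the distance-and-vote step with its own decision rule.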
Role Description
This is a full-time hybrid role as a GCP Data Engineer. You will be responsible for managing large sets of structured and unstructured data and for developing processes that convert data into insights, information, and knowledge.
Skill Name: GCP Data Engineer
Experience: 7-10 years
Notice Period: 0-15 days
Location: Pune
If you have a passion for data engineering and possess the following, we would love to hear from you:
🔹 7 to 10 years of experience working on Software Development Life Cycle (SDLC)
🔹 At least 4 years of experience with Google Cloud Platform, with a focus on BigQuery
🔹 Proficiency in Java and Python, along with experience in Google Cloud SDK & API Scripting
🔹 Experience in the Finance/Revenue domain would be considered an added advantage
🔹 Familiarity with GCP Migration activities and the DBT Tool would also be beneficial
You will play a crucial role in developing and maintaining our data infrastructure on the Google Cloud platform.
Your expertise in SDLC, BigQuery, Java, Python, and Google Cloud SDK & API scripting will be instrumental in ensuring the smooth operation of our data systems.
Join our dynamic team and contribute to our mission of harnessing the power of data to make informed business decisions.
The Merck Data Engineering Team is responsible for designing, developing, testing, and supporting automated end-to-end data pipelines and applications on Merck’s data management and global analytics platform (Palantir Foundry, Hadoop, AWS and other components).
The Foundry platform comprises multiple different technology stacks, hosted on Amazon Web Services (AWS) infrastructure or on-premises in Merck's own data centers. Developing pipelines and applications on Foundry requires:
• Proficiency in SQL / Java / Python (Python required; all 3 not necessary)
• Proficiency in PySpark for distributed computation
• Familiarity with Postgres and ElasticSearch
• Familiarity with HTML, CSS, and JavaScript and basic design/visual competency
• Familiarity with common databases and access interfaces (e.g. JDBC, MySQL, Microsoft SQL Server); not all types required
This position will be project based and may work across multiple smaller projects or a single large project utilizing an agile project methodology.
Roles & Responsibilities:
• Develop data pipelines by ingesting various data sources – structured and unstructured – into Palantir Foundry
• Participate in the end-to-end project lifecycle, from requirements analysis to go-live and operations of an application
• Act as a business analyst when developing requirements for Foundry pipelines
• Review code developed by other data engineers and check against platform-specific standards, cross-cutting concerns, coding and configuration standards and functional specification of the pipeline
• Document technical work in a professional and transparent way; create high-quality technical documentation
• Work out the best possible balance between technical feasibility and business requirements (the latter can be quite strict)
• Deploy applications on Foundry platform infrastructure with clearly defined checks
• Implementation of changes and bug fixes via Merck's change management framework and according to system engineering practices (additional training will be provided)
• DevOps project setup following Agile principles (e.g. Scrum)
• Besides working on projects, act as third-level support for critical applications; analyze and resolve complex incidents/problems. Debug problems across the full Foundry stack and code based on Python, PySpark, and Java
• Work closely with business users, data scientists/analysts to design physical data models
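A pipeline transform of the kind described above typically wraps plain record-level logic in Spark plumbing. The sketch below is illustrative only — the field names and dataset contents are invented, and nothing here reflects Foundry's actual transform API. The parsing function is ordinary Python, with the PySpark wiring guarded so it only runs where a Spark runtime is available.

```python
import json
from datetime import datetime

def normalize_record(raw: str) -> dict:
    """Parse one raw JSON line into a typed, analysis-ready record.
    Field names here are hypothetical examples."""
    rec = json.loads(raw)
    return {
        "id": str(rec["id"]),
        "amount": float(rec.get("amount", 0.0)),
        "event_date": datetime.strptime(rec["date"], "%Y-%m-%d")
                              .date().isoformat(),
    }

# Optional PySpark wiring (requires a Spark runtime):
try:
    from pyspark.sql import Row, SparkSession

    def build_frame(spark: "SparkSession", lines):
        # Distribute the raw lines and apply the same record logic
        rdd = spark.sparkContext.parallelize(lines)
        return spark.createDataFrame(rdd.map(lambda l: Row(**normalize_record(l))))
except ImportError:
    pass

print(normalize_record('{"id": 7, "amount": "12.5", "date": "2024-01-31"}'))
```

Keeping the record logic separate from the Spark wiring makes it unit-testable without a cluster, which helps with the code-review and quality responsibilities listed above.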
● Understanding user behavior and performing root-cause analysis of changes in data trends to identify corrections or propose desirable enhancements in the product and across different verticals
● Excellent problem-solving skills and the ability to make sound judgments based on trade-offs between different solutions under complex problem constraints
● Defining and monitoring KPIs for product/content/business performance and identifying ways to improve them
● Should be a strong advocate of a data-driven approach and drive analytics decisions through user testing, data analysis, and A/B testing
● Help in defining the analytics roadmap for the product
● Working with stakeholders, including Customer Support team, Customer Success team and senior executive management to address customer/product problems effectively.
● Providing regular reports to the management on product performance and issues.
● Managing prioritization and trade-offs between customer experience, business impact and product performance.
● Working with the PM team during the sprint releases to help enhance the product with long-term solutions for product issues.
● Driving the collection of new data that would help build the next generation of algorithms (e.g. audience segmentation, contextual targeting)
● Explain trends across data sources, potential opportunities for growth or improvement, and data caveats for descriptive, diagnostic, predictive (including forecasting), and prescriptive data projects
● Develop user archetypes and build dashboards to demonstrate their usage patterns
● Capture and document data user stories, use cases, and workflows
BASIC QUALIFICATIONS
- 2+ years' experience in program or project management
- Project-handling experience using Six Sigma/Lean processes
- Experience interpreting data to make business recommendations
- Bachelor’s degree or higher in Operations, Business, Project Management, Engineering
- 5-10 years' experience in project/customer-satisfaction roles, with a proven success record
- Understand basic and systematic approaches to manage projects/programs
- Structured problem-solving approach to identify and fix problems
- Open-minded, creative and proactive thinking
- A pioneering mindset: inventing and making a difference
- Understanding of customer experience: listening to the customer's voice and working backwards to improve business processes and operations
- Certification in Six Sigma
PREFERRED QUALIFICATIONS
- Automation skills, with experience in advanced SQL, Python, and Tableau
Responsibilities:
- Should act as a technical resource for the Data Science team and be involved in creating and implementing current and future Analytics projects like data lake design, data warehouse design, etc.
- Analyse and design ETL solutions to store/fetch data from multiple systems like Google Analytics, CleverTap, CRM systems, etc.
- Develop and maintain data pipelines for real-time analytics as well as batch analytics use cases.
- Collaborate with data scientists and actively work in the feature engineering and data preparation phase of model building
- Collaborate with product development and DevOps teams in implementing data collection and aggregation solutions
- Ensure quality and consistency of the data in Data warehouse and follow best data governance practices
- Analyse large amounts of information to discover trends and patterns
- Mine and analyse data from company databases to drive optimization and improvement of product development, marketing techniques, and business strategies.
Requirements
- Bachelor’s or Master’s degree in a highly numerate discipline such as Engineering, Science, or Economics
- 2-6 years of proven experience working as a Data Engineer, preferably in an e-commerce/web-based or consumer technology company
- Hands-on experience working with big data tools like Hadoop, Spark, Flink, Kafka, and so on
- Good understanding of AWS ecosystem for big data analytics
- Hands-on experience creating data pipelines, either using tools or by independently writing scripts
- Hands-on experience with scripting languages like Python, Scala, Unix shell scripting, and so on
- Strong problem-solving skills with an emphasis on product development
- Experience with business intelligence tools, e.g. Tableau or Power BI, would be an added advantage (not mandatory)
About Us:
GreedyGame is looking for a Business Analyst to join its clan. We are looking for an enthusiastic Business Analyst who likes to play with data. You'll build insights from data, create analytical dashboards, and monitor KPI values. You will also coordinate with teams working on different layers of the infrastructure.
Job details:
Seniority Level: Associate
Industry: Marketing & Advertising
Employment Type: Full Time
Job Location: Bangalore
Experience: 1-2 years
WHAT ARE WE LOOKING FOR?
- Excellent planning, organizational, and time management skills.
- Exceptional analytical and conceptual thinking skills.
- Previous experience working closely with Operations and Product teams.
- Competency in Excel and SQL is a must.
- Experience with a programming language like Python is required.
- Knowledge of Marketing Tools is preferable.
WHAT WILL BE YOUR RESPONSIBILITIES?
- Evaluating business processes, anticipating requirements, uncovering areas for improvement, developing and implementing solutions.
- Should be able to generate meaningful insights to help the marketing team and product team in enhancing the user experience for Mobile and Web Apps.
- Leading ongoing reviews of business processes and developing optimization strategies.
- Performing requirements analysis from a user and business point of view
- Combining data from multiple sources like SQL tables, Google Analytics, in-house analytical signals, etc., and deriving relevant insights
- Deciding the success metrics and KPIs for different Products and features and making sure they are achieved.
- Act as quality assurance liaison prior to the release of new data analyses or applications.
Skills and Abilities:
- Python
- SQL
- Business Analytics
- BigQuery
WHAT'S IN IT FOR YOU?
- An opportunity to be a part of a fast scaling start-up in the AdTech space that offers unmatched services and products.
- To work with a team of young enthusiasts who are always upbeat and self-driven to achieve bigger milestones in shorter time spans.
- A workspace that is wide open as per the open door policy at the company, located in the most happening center of Bangalore.
- A well-fed stomach makes the mind work better, and therefore we provide free lunch with a wide variety on all days of the week, a stocked-up pantry to satiate your want for munchies, a foosball table to bust stress, and above all a great working environment.
- We believe that we grow as you grow. Once you are part of our team, your growth becomes essential to us, and to make sure that happens, timely formal and informal feedback is given.
4-6 years of total experience in data warehousing and business intelligence
3+ years of solid Power BI experience (Power Query, M query, DAX, aggregates)
2 years’ experience building Power BI using cloud data (Snowflake, Azure Synapse, SQL DB, data lake)
Strong experience building visually appealing UI/UX in Power BI
Understand how to design Power BI solutions for performance (composite models, incremental refresh, analysis services)
Experience building Power BI using large data in direct query mode
Expert SQL background (query building, stored procedure, optimizing performance)
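As a small, concrete illustration of the SQL query-building skill asked for above, here is a typical aggregate query of the kind a Power BI dataset might sit on. This sketch uses Python's built-in sqlite3 in place of a production engine, and the table and column names are invented for the example.

```python
import sqlite3

# In-memory database standing in for a warehouse table
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('north', 100), ('north', 250), ('south', 75), ('south', 40);
""")

# Total and average sales per region, largest total first --
# the shape of query a report visual typically aggregates over
rows = conn.execute("""
    SELECT region, SUM(amount) AS total, AVG(amount) AS avg_amount
    FROM sales
    GROUP BY region
    ORDER BY total DESC
""").fetchall()
print(rows)  # -> [('north', 350.0, 175.0), ('south', 115.0, 57.5)]
```

In a direct-query Power BI model, pushing aggregation into SQL like this (rather than pulling row-level data into the report) is one of the standard performance levers.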
- Advanced Spark programming skills
- Advanced Python skills
- Data engineering ETL and ELT skills
- Expertise in streaming data
- Experience with the Hadoop ecosystem
- Basic understanding of cloud platforms
- Technical design skills; alternative approaches
- Hands-on expertise in writing UDFs
- Hands-on expertise in streaming data ingestion
- Able to independently tune Spark scripts
- Advanced debugging skills and large-volume data handling
- Independently break down and plan technical tasks
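The UDF expertise mentioned above usually amounts to wrapping a plain Python function for Spark. The sketch below is illustrative only (the masking function and column name are invented); the PySpark registration is guarded so the core logic runs and is testable even without a Spark runtime.

```python
def mask_email(addr: str) -> str:
    """Core logic for a hypothetical UDF: keep the first character
    of the local part and the domain, mask the rest."""
    local, _, domain = addr.partition("@")
    if not domain:
        return addr  # not an email; pass through unchanged
    return local[:1] + "***@" + domain

# Registering it as a Spark UDF (only where PySpark is installed).
try:
    from pyspark.sql import functions as F, types as T

    mask_email_udf = F.udf(mask_email, T.StringType())
    # usage: df.withColumn("email", mask_email_udf(F.col("email")))
except ImportError:
    pass

print(mask_email("alice@example.com"))  # -> a***@example.com
```

A tuning note in the same spirit as the requirements: Python UDFs force row-by-row serialization, so built-in Spark SQL functions (or pandas UDFs) are generally preferred when an equivalent exists.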