Product Analyst
2 - 6 yrs
₹15L - ₹25L / yr (ESOP available)
Full time
Bengaluru (Bangalore)
Skills
SQL
Tableau
Power BI

JOB DESCRIPTION

Product Analyst

About Us

 

"Slack for Construction"

An early-stage startup co-founded by IIT Roorkee alumni: a mobile-based operating system for managing construction and architectural projects. Today, material requests and project information are scattered across WhatsApp; our mobile app brings all of this into one place, almost like a Slack tool for construction. The product is a mobile app plus SaaS platform for administering and managing the process, with 150,000 users and subscription-based pricing. It helps construction project owners and contractors track on-site progress in real time to finish projects on time and on budget. We aim to bring the speed of software development to infrastructure development. Backed by industry experts, we are on a mission to help construction, the second-largest industry in India, transition from pen and paper to digital.

About the team

As a productivity app startup, we value productivity and ownership most. That helps raise our own bar and the bar of the people we hire. We follow agile and scrum approaches for product development and use best-in-class tools and practices. Measuring our progress on a weekly basis and iterating fast enables us to build breakthrough modules and features rapidly. If you join us, you will be constantly thrown into challenging situations. The decisions you take will directly impact our clients and sales. That's how we learn.

Requirements -

  • Prior experience in any data-driven decision-making field.
  • Working knowledge of querying data using SQL.
  • Familiarity with customer and business data analytics tools like Segment, Mixpanel, Google Analytics, SmartLook, etc.
  • Data visualisation tools like Tableau, Power BI, etc.

Responsibilities -

"All things data"

  • Ability to synthesize complex data into actionable goals.
  • Critical thinking skills to recommend original and productive ideas
  • Ability to visualise user stories and create user funnels
  • Perform user test sessions and market surveys to inform product development teams
  • Excellent writing skills to prepare detailed product specifications and analytics reports
  • Help define product strategy and roadmaps with scalable architecture
  • Interpersonal skills to work collaboratively with various stakeholders who may have competing interests
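The "user funnels" responsibility above usually comes down to a few lines of SQL. A minimal, hypothetical sketch (the `events` table, stage names, and data are illustrative, not from this posting), run through Python's built-in sqlite3:

```python
import sqlite3

# Hypothetical example: count users reaching each stage of a simple funnel.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, stage TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(1, "install"), (1, "signup"), (1, "first_project"),
     (2, "install"), (2, "signup"),
     (3, "install")],
)

# One row per funnel stage with the count of distinct users who reached it.
rows = conn.execute(
    """
    SELECT stage, COUNT(DISTINCT user_id) AS users
    FROM events
    GROUP BY stage
    ORDER BY users DESC
    """
).fetchall()

for stage, users in rows:
    print(stage, users)  # install 3, signup 2, first_project 1
```

In practice the same query shape applies to product-analytics exports from tools like Mixpanel or Segment, with drop-off between consecutive stages computed from the counts.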

 

Users love Cutshort
Read about what our users have to say about finding their next opportunity on Cutshort.

Subodh Popalwar

Software Engineer, Memorres
For 2 years, I had trouble finding a company with good work culture and a role that will help me grow in my career. Soon after I started using Cutshort, I had access to information about the work culture, compensation and what each company was clearly offering.

About: a mobile-based OS that manages construction & architectural projects

Why apply to jobs via Cutshort
Personalized job matches
Stop wasting time. Get matched with jobs that meet your skills, aspirations and preferences.
Verified hiring teams
See actual hiring teams, find common social connections or connect with them directly. No 3rd party agencies here.
Move faster with AI
We use AI to get you faster responses, recommendations and unmatched user experience.
21,01,133
Matches delivered
37,12,187
Network size
15,000
Companies hiring

Similar jobs

Gurugram, Bengaluru (Bangalore), Mumbai
4 - 9 yrs
Best in industry
Machine Learning (ML)
Data Science
media analytics
SQL
Python
+4 more

Our client combines Adtech and Martech platform strategy with data science and data engineering expertise, helping its clients make advertising work better for people.

 
Key Role:
  • Act as primary day-to-day contact on analytics to agency-client leads
  • Develop bespoke analytics proposals for presentation to agencies & clients, for delivery within the teams
  • Ensure delivery of projects and services across the analytics team meets our stakeholder requirements (time, quality, cost)
  • Hands on platforms to perform data pre-processing that involves data transformation as well as data cleaning
  • Ensure data quality and integrity
  • Interpret and analyse data problems
  • Build analytic systems and predictive models
  • Increase the performance and accuracy of machine learning algorithms through fine-tuning and further refinement
  • Visualize data and create reports
  • Experiment with new models and techniques
  • Align data projects with organizational goals


Requirements

  • Minimum 6-7 years' experience working in Data Science
  • Prior experience as a Data Scientist within digital media is desirable
  • Solid understanding of machine learning
  • A degree in a quantitative field (e.g. economics, computer science, mathematics, statistics, engineering, physics, etc.)
  • Experience with SQL / BigQuery / the GMP tech stack / clean rooms such as ADH
  • A knack for statistical analysis and predictive modelling
  • Good knowledge of R and Python
  • Experience with SQL databases such as MySQL and PostgreSQL
  • Knowledge of data management and visualization techniques
  • Hands-on experience with BI/visual analytics tools like Power BI, Tableau, or Data Studio
  • Evidence of technical comfort and a good understanding of internet functionality desirable
  • Analytical pedigree: evidence of having approached problems from a mathematical perspective and worked through to a solution in a logical way
  • Proactive and results-oriented
  • A positive, can-do attitude with a thirst to continually learn new things
  • An ability to work independently and collaboratively with a wide range of teams
  • Excellent communication skills, both written and oral
Cloth software company
Agency job
via Jobdost by Sathish Kumar
Delhi
1 - 3 yrs
₹1L - ₹6L / yr
SQL
Data Analytics

What you will do:

  • Understand the processes, KPIs, and pain points of CaaStle business teams
  • Build scalable data products, self-service tools, data cubes to analyze and present data associated with acquisition, retention, product performance, operations, client services, etc.
  • Closely partner with data engineering, product, and business teams and participate in requirements capture, research design, data collection, dashboard generation, and translation of results into actionable insights that can add value for business stakeholders
  • Leverage advanced analytics to drive key success metrics for business and revenue generation
  • Operationalize, implement, and automate changes to drive data-driven decisions
  • Attend and play an active role in answering questions from the executive and/or business teams through data mining and analysis

We would love for you to have:

  • Education: An advanced degree in Computer Science, Statistics, Mathematics, Engineering, Economics, Business Analytics, or a related field is required
  • Experience: 2-4 years of professional experience
  • Proficiency in data visualization/reporting tools (e.g. Tableau, Qlikview, etc.)
  • Experience in A/B testing and measuring the performance of experiments
  • Strong proficiency with SQL-based languages; experience with large-scale data analytics technologies (e.g., Hadoop and Spark)
  • Strong analytical skills and a business mindset, with the ability to translate complex concepts and analysis into clear and concise takeaways that drive insights and strategies
  • Excellent communication, social, and presentation skills with meticulous attention to detail
  • Programming experience in Python, R, or other languages
  • Knowledge of data mining and statistical modeling approaches and techniques
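A/B test results like those mentioned above are commonly checked with a two-proportion z-test. A minimal stdlib sketch with made-up numbers (a real analysis would also cover sample-size planning and multiple-testing corrections):

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-statistic for the difference between two conversion rates,
    using the pooled standard error."""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical experiment: 200/1000 control vs 260/1000 variant conversions.
z = two_proportion_z(200, 1000, 260, 1000)
print(round(z, 2))  # about 3.19; |z| > 1.96 is significant at the 5% level
```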

 

CaaStle is committed to equality of opportunity in employment. It has been and will continue to be the policy of CaaStle to provide full and equal employment opportunities to all employees and candidates for employment without regard to race, color, religion, national or ethnic origin, veteran status, age, sexual orientation, gender identity, or physical or mental disability. This policy applies to all terms, conditions and privileges of employment, such as those pertaining to training, transfer, promotion, compensation and recreational programs.

Codejudge
Posted by Vaishnavi M
Bengaluru (Bangalore)
3 - 7 yrs
₹20L - ₹25L / yr
SQL
Python
Data architecture
Data mining
Data Analytics
Job description
  • The ideal candidate is adept at using large data sets to find opportunities for product and process optimization and using models to test the effectiveness of different courses of action.
  • Mine and analyze data from company databases to drive optimization and improvement of product development, marketing techniques and business strategies.
  • Assess the effectiveness and accuracy of new data sources and data gathering techniques.
  • Develop custom data models and algorithms to apply to data sets.
  • Use predictive modeling to increase and optimize customer experiences, revenue generation, ad targeting and other business outcomes.
  • Develop company A/B testing framework and test model quality.
  • Develop processes and tools to monitor and analyze model performance and data accuracy.

Roles & Responsibilities

  • Experience using statistical languages (R, Python, SQL, etc.) to manipulate data and draw insights from large data sets.
  • Experience working with and creating data architectures.
  • Looking for someone with 3-7 years of experience manipulating data sets and building statistical models.
  • Has a Bachelor's or Master's in Computer Science or another quantitative field.
  • Knowledge and experience in statistical and data mining techniques: GLM/regression, random forest, boosting, trees, text mining, social network analysis, etc.
  • Experience querying databases and using statistical computer languages: R, Python, SQL, etc.
  • Experience creating and using advanced machine learning algorithms and statistics: regression, simulation, scenario analysis, modeling, clustering, decision trees, neural networks, etc.
  • Experience with distributed data/computing tools: Map/Reduce, Hadoop, Hive, Spark, Gurobi, MySQL, etc.
  • Experience visualizing/presenting data for stakeholders using Periscope, Business Objects, D3, ggplot, etc.
Chennai, Coimbatore, Bengaluru (Bangalore), Pune, Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Kochi (Cochin), Kolkata
6 - 9 yrs
₹6L - ₹10L / yr
Data Warehouse (DWH)
Informatica
ETL
Amazon Web Services (AWS)
SQL
+1 more
Designation: Informatica Cloud (IICS)
Experience: 6-9 years
Location: Pan India
Job Description

 
Must have: work experience with Informatica Intelligent Cloud Services (IICS), SQL, and data analysis.
 
Roles and responsibilities
 
  • Tracking/management of all software assets, including internal license assessments
  • Participate in software vendor contract/license negotiations and the development of software licenses and associated maintenance contracts
  • Knowledge of software license procurement and hands-on experience with the Ariba tool
  • Prepare and assist in the performance of periodic compliance reports
  • Provide support to end users regarding specific vendor product use rights
  • Assist in the establishment of internal controls related to software asset management, governance, and compliance
  • Exposure to Oracle, Microsoft, IBM, Adobe, and other enterprise licensing
  • Experience handling cloud (AWS, GCP) and billing
  • Participated in license compliance audits
  • Tracking/management of software license governance and compliance, in accordance with enterprise policy, processes, procedures, and controls, by internal staff and external service providers

Required:
  • Working knowledge of Remedy, ServiceNow, or any IT asset tool platform
  • IT asset management and discovery tools experience
  • Experience interpreting licensing terms and conditions
  • Conceptual knowledge of information technology
Tredence
Bengaluru (Bangalore), Pune, Gurugram, Chennai
8 - 12 yrs
₹12L - ₹30L / yr
Snowflake schema
Snowflake
SQL
Data modeling
Data engineering
+1 more

JOB DESCRIPTION: THE IDEAL CANDIDATE WILL:

• Ensure new features and subject areas are modelled to integrate with existing structures and provide a consistent view. Develop and maintain documentation of the data architecture, data flow and data models of the data warehouse appropriate for various audiences. Provide direction on adoption of Cloud technologies (Snowflake) and industry best practices in the field of data warehouse architecture and modelling.

• Providing technical leadership to large enterprise-scale projects. You will also be responsible for preparing estimates and defining technical solutions to proposals (RFPs). This role requires a broad range of skills and the ability to step into different roles depending on the size and scope of the project.

ELIGIBILITY CRITERIA: Desired Experience/Skills:
• Must have 5+ total years in IT, with 2+ years' experience working as a Snowflake Data Architect and 4+ years in data warehouse, ETL, and BI projects.
• Must have experience with at least two end-to-end implementations of the Snowflake cloud data warehouse and three end-to-end data warehouse implementations on-premises, preferably on Oracle.

• Expertise in Snowflake – data modelling, ELT using Snowflake SQL, implementing complex stored Procedures and standard DWH and ETL concepts
• Expertise in Snowflake advanced concepts like setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, Zero copy clone, time travel and understand how to use these features
• Expertise in deploying Snowflake features such as data sharing, events and lake-house patterns
• Hands-on experience with Snowflake utilities, SnowSQL, SnowPipe, Big Data model techniques using Python
• Experience in Data Migration from RDBMS to Snowflake cloud data warehouse
• Deep understanding of relational as well as NoSQL data stores, methods and approaches (star and snowflake, dimensional modelling)
• Experience with data security and data access controls and design
• Experience with AWS or Azure data storage and management technologies such as S3 and ADLS
• Build processes supporting data transformation, data structures, metadata, dependency and workload management
• Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
• Provide resolution to an extensive range of complicated data pipeline related problems, proactively and as issues surface
• Must have expertise in AWS or Azure Platform as a Service (PAAS)
• Certified Snowflake cloud data warehouse Architect (Desirable)
• Should be able to troubleshoot problems across infrastructure, platform and application domains.
• Must have experience of Agile development methodologies
• Strong written communication skills; effective and persuasive in both written and oral communication

Nice-to-have Skills/Qualifications: Bachelor's and/or Master's degree in computer science, or equivalent experience.
• Strong communication, analytical and problem-solving skills with a high attention to detail.

 

About you:
• You are self-motivated, collaborative, eager to learn, and hands on
• You love trying out new apps, and find yourself coming up with ideas to improve them
• You stay ahead with all the latest trends and technologies
• You are particular about following industry best practices and have high standards regarding quality

Mumbai
5 - 7 yrs
₹20L - ₹25L / yr
ETL
Talend
OLAP
Data governance
SQL
+8 more
  • Key responsibility is to design and develop a data pipeline, including the architecture, prototyping, and development of data extraction, transformation/processing, cleansing/standardizing, and loading into a Data Warehouse at real-time or near-real-time frequency. Source data can be in structured, semi-structured, and/or unstructured format.
  • Provide technical expertise to design efficient data ingestion solutions that consolidate data from RDBMS, APIs, messaging queues, weblogs, images, audio, documents, etc. of enterprise applications, SaaS applications, and external third-party sites or APIs, through ETL/ELT, API integrations, Change Data Capture, Robotic Process Automation, custom Python/Java coding, etc.
  • Development of complex data transformations using Talend (Big Data edition), Python/Java transformations in Talend, SQL/Python/Java UDXs, AWS S3, etc. to load into an OLAP Data Warehouse in structured/semi-structured form
  • Development of data models and creation of transformation logic to populate models for faster data consumption with simple SQL
  • Implementing automated audit and quality assurance checks in the data pipeline
  • Documenting and maintaining data lineage to enable data governance
  • Coordination with BIU, IT, and other stakeholders to provide best-in-class data pipeline solutions, exposing data via APIs, loading into downstream systems, NoSQL databases, etc.
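The extract-cleanse-load flow described above can be reduced to a minimal sketch. The records, field names, and cleansing rules here are hypothetical, and a production pipeline (e.g. in Talend) would add auditing, error handling, and incremental loads:

```python
import sqlite3

# Hypothetical raw source records, as they might arrive from an API or file.
raw_records = [
    {"id": "1", "city": "  Mumbai ", "amount": "1200.50"},
    {"id": "2", "city": "DELHI", "amount": "980"},
    {"id": "3", "city": "pune", "amount": None},  # missing value
]

def transform(rec):
    """Cleanse/standardize one record: trim and title-case text,
    coerce numeric strings, and default missing amounts to 0."""
    return (
        int(rec["id"]),
        rec["city"].strip().title(),
        float(rec["amount"]) if rec["amount"] is not None else 0.0,
    )

# Load the transformed rows into a warehouse table (sqlite stands in here).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, city TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", map(transform, raw_records))

total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 2180.5
```

An automated quality check, as mentioned above, would assert on row counts and value ranges after each load rather than relying on manual inspection.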

Requirements

  • Programming experience using Python / Java, to create functions / UDX
  • Extensive technical experience with SQL on RDBMS (Oracle/MySQL/PostgreSQL, etc.), including code optimization techniques
  • Strong ETL/ELT skill set using Talend Big Data Edition. Experience with Talend CDC and MDM functionality will be an advantage.
  • Experience and expertise in implementing complex data pipelines, including semi-structured and unstructured data processing
  • Expertise in designing efficient data ingestion solutions that consolidate data from RDBMS, APIs, messaging queues, weblogs, images, audio, documents, etc. of enterprise applications, SaaS applications, and external third-party sites or APIs, through ETL/ELT, API integrations, Change Data Capture, Robotic Process Automation, custom Python/Java coding, etc.
  • Good understanding & working experience in OLAP Data Warehousing solutions (Redshift, Synapse, Snowflake, Teradata, Vertica, etc) and cloud-native Data Lake (S3, ADLS, BigQuery, etc) solutions
  • Familiarity with AWS tool stack for Storage & Processing. Able to recommend the right tools/solutions available to address a technical problem
  • Good knowledge of database performance and tuning, troubleshooting, query optimization, and tuning
  • Good analytical skills with the ability to synthesize data to design and deliver meaningful information
  • Good knowledge of Design, Development & Performance tuning of 3NF/Flat/Hybrid Data Model
  • Know-how on any No-SQL DB (DynamoDB, MongoDB, CosmosDB, etc) will be an advantage.
  • Ability to understand business functionality, processes, and flows
  • Good combination of technical and interpersonal skills with strong written and verbal communication; detail-oriented with the ability to work independently

Functional knowledge

  • Data Governance & Quality Assurance
  • Distributed computing
  • Linux
  • Data structures and algorithm
  • Unstructured Data Processing
Fragma Data Systems
Posted by Priyanka U
Remote only
4 - 10 yrs
₹12L - ₹23L / yr
Informatica
ETL
Big Data
Spark
SQL
Skill: Informatica with Big Data Management (BDM)

1. Minimum 6 to 8 years of experience in Informatica BDM development
2. Experience working with Spark/SQL
3. Develops Informatica mappings/SQL
4. Should have experience in Hadoop, Spark, etc.

Work days- Sun-Thu
Day shift
 
 
 
Prescience Decision Solutions
Shivakumar K
Posted by Shivakumar K
Bengaluru (Bangalore)
3 - 7 yrs
₹10L - ₹20L / yr
Big Data
ETL
Spark
Apache Kafka
Apache Spark
+4 more

The Data Engineer would be responsible for selecting and integrating the required Big Data tools and frameworks, and would implement data ingestion and ETL/ELT processes.

Required Experience, Skills and Qualifications:

  • Hands-on experience with Big Data tools/technologies like Spark, Databricks, MapReduce, Hive, HDFS
  • Expertise in and excellent understanding of Big Data toolsets such as Sqoop, Spark Streaming, Kafka, NiFi
  • Proficiency in any of Python, Scala, or Java, with 4+ years' experience
  • Experience with cloud infrastructure like MS Azure, Data Lake, etc.
  • Good working knowledge of NoSQL DBs (Mongo, HBase, Cassandra)
Chennai, Bengaluru (Bangalore), Hyderabad
4 - 10 yrs
₹9L - ₹20L / yr
Informatica
informatica developer
Informatica MDM
Data integration
Informatica Data Quality
+7 more
  • Should have good hands-on experience in Informatica MDM Customer 360, Data Integration(ETL) using PowerCenter, Data Quality.
  • Must have strong skills in Data Analysis, Data Mapping for ETL processes, and Data Modeling.
  • Experience with the SIF framework including real-time integration
  • Should have experience in building C360 Insights using Informatica
  • Should have good experience in creating performant design using Mapplets, Mappings, Workflows for Data Quality(cleansing), ETL.
  • Should have experience in building different data warehouse architectures, like Enterprise, Federated, and Multi-Tier architecture.
  • Should have experience in configuring Informatica Data Director with reference to the data governance of users, IT managers, and data stewards.
  • Should have good knowledge in developing complex PL/SQL queries.
  • Should have working experience on UNIX and shell scripting to run the Informatica workflows and to control the ETL flow.
  • Should know about Informatica Server installation and knowledge on the Administration console.
  • Working experience with both the Developer tool and the Administration console is an added advantage.
  • Working experience in Amazon Web Services (AWS) is an added advantage. Particularly on AWS S3, Data pipeline, Lambda, Kinesis, DynamoDB, and EMR.
  • Should be responsible for the creation of automated BI solutions, including requirements, design, development, testing, and deployment
PriceSenz
Posted by Karthik Padmanabhan
Remote only
2 - 15 yrs
₹1L - ₹20L / yr
ETL
SQL
Informatica PowerCenter

If you are an outstanding ETL Developer with a passion for technology and looking forward to being part of a great development organization, we would love to hear from you. We are offering technology consultancy services to our Fortune 500 customers with a primary focus on digital technologies. Our customers are looking for top-tier talents in the industry and willing to compensate based on your skill and expertise. The nature of our engagement is Contract in most cases. If you are looking for the next big step in your career, we are glad to partner with you. 

 

Below is the job description for your review.

  • Extensive hands-on experience in designing and developing ETL packages using SSIS
  • Extensive experience in performance tuning of SSIS packages
  • In-depth knowledge of data warehousing concepts and ETL systems, and relational databases like SQL Server 2012/2014

Did not find a job you were looking for?
Search for relevant jobs from 10000+ companies such as Google, Amazon & Uber actively hiring on Cutshort.