Data Analysis Specialist
at Monza Pro
Posted by George Paul
1 - 50 yrs
₹12L - ₹15L / yr
Remote only
Skills
Data Analytics
Data Visualization
PowerBI
Tableau
Qlikview
Spotfire
Data-flow analysis

We are seeking a highly motivated and detail-oriented Data Analysis Specialist to join our growing [department name] team. In this role, you will be responsible for working with large datasets to identify trends, solve problems, and generate insights that inform critical business decisions. You will be a key partner to various stakeholders across the organization, translating complex data into actionable recommendations.

Responsibilities

  • Acquire data from various sources (internal databases, external sources, etc.)
  • Clean, transform, and prepare data for analysis (see the sketch after this list)
  • Perform statistical analysis and modeling
  • Identify trends and patterns in data
  • Develop clear and concise data visualizations (charts, graphs, dashboards)
  • Communicate findings and recommendations to stakeholders through reports and presentations
  • Collaborate with cross-functional teams (e.g., marketing, sales, product) to understand business needs and translate them into data-driven solutions
  • Stay up-to-date on the latest data analysis tools and techniques
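To make the acquire–clean–analyze–visualize workflow above concrete, here is a minimal, hypothetical pandas/matplotlib sketch; the file name, columns, and chart are illustrative placeholders, not details from this posting.

```python
# Minimal sketch of a typical analysis pass: load, clean, aggregate, chart.
# The CSV file and column names are hypothetical placeholders.
import pandas as pd
import matplotlib.pyplot as plt

# Acquire: load raw data exported from an internal database.
df = pd.read_csv("sales_export.csv", parse_dates=["order_date"])

# Clean: drop incomplete rows and obviously bad values.
df = df.dropna(subset=["region", "revenue"])
df = df[df["revenue"] >= 0]

# Transform & analyze: monthly revenue trend per region.
monthly = (
    df.set_index("order_date")
      .groupby("region")["revenue"]
      .resample("M")
      .sum()
      .unstack(level="region")
)

# Visualize: a simple trend chart for stakeholders.
monthly.plot(kind="line", title="Monthly revenue by region")
plt.ylabel("Revenue")
plt.tight_layout()
plt.savefig("monthly_revenue_by_region.png")
```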



About Monza Pro

Founded: 2023
Type:
Size: 20-100
Stage: Bootstrapped
About

Monza is a software company on a mission to revolutionize the way businesses access leads. Monza provides a pay-per-hire business model where registered pros can bid on projects, message customers, and host live virtual consultations online without any upfront cost; they are charged only when they are hired. While providing small business owners with a guaranteed return on their investment, contrary to existing options available, Monza also provides pros with a collection of software to help run their day-to-day operations.

Tech Stack
Java
HTML/CSS
Candid answers by the company
What does the company do?

Monza provides a risk-free lead generation strategy for small business owners while providing software to run their day-to-day operations in bulk at no cost.

Company social profiles
Blog · LinkedIn · Twitter

Similar jobs

Grubbrr systems Pvt Ltd
Posted by Kinjal Patel
Ahmedabad
1 - 3 yrs
₹3L - ₹9L / yr
Data Analytics
Data management
Data-flow analysis
QBR decks
Pilot sales

Job Responsibilities:

  • Coordinate with CSMs to complete the pilot sales tracker, QBR decks, and end-of-pilot decks
  • Monitor daily sales volume and alert CSMs of locations with low volume counts
  • Collaborate with other teammates to extract data figures, perform data entry (Google Sheets & Google Slides), and monitor key performance indicators (KPIs) to determine the success of business initiatives
  • Help with minor integration changes
  • Help with tickets and provide support when required
  • Basic product knowledge of the software to assist with non-client-facing action items
  • Good communication skills preferred

Adastra India
Remote only
5 - 7 yrs
₹20L - ₹30L / yr
Google Cloud Platform (GCP)
Python
SQL
BigQuery
Data-flow analysis

Job Description

As a Senior Python Data Engineer on GCP, you will be responsible for:

  • Technical requirements gathering and development of functional specifications
  • Analysis of various development alternatives for applications, information systems, and modules
  • Code development in the area of data management – cloud, data integration, analytics & reporting
  • Support for junior developers, team leadership/mentorship
  • Support for the presales team – technical whitepapers, solution reviews
  • Experience with cloud-based platforms, specifically GCP
  • Strong programming skills in Python and SQL
  • Career-path ambition – motivation for a management position in a foreign company
  • Strong communication, presentation, and networking skills
  • Work diligence & initiative – a “deliver no matter what” attitude
  • Experience working with GCP resources such as BigQuery, Cloud Functions, Dataflow, and Cloud Composer
  • Experience building data pipelines with Airflow or other orchestration tools preferred (see the sketch after this list)
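As a rough illustration of the Airflow / Cloud Composer orchestration mentioned in the last bullet, here is a minimal sketch of a DAG that runs a scheduled BigQuery transformation. It assumes the apache-airflow-providers-google package; the project, dataset, and query are hypothetical placeholders.

```python
# Minimal Airflow DAG sketch: a daily BigQuery transformation on GCP.
# Assumes apache-airflow-providers-google; all project/dataset/table names
# are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_orders_aggregate",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Aggregate raw order events into a daily summary table.
    aggregate_orders = BigQueryInsertJobOperator(
        task_id="aggregate_orders",
        configuration={
            "query": {
                "query": """
                    SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
                    FROM `my-project.raw.orders`
                    GROUP BY order_date
                """,
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "analytics",
                    "tableId": "daily_orders",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )
```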


InEvolution
Posted by Pavan P K
Remote only
2 - 3 yrs
₹5L - ₹7L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+5 more

About InEvolution


Founded in 2009, InEvolution stands as a beacon of excellence in providing back-office, operations, and customer support services globally. Our team, comprising highly skilled professionals, is committed to delivering top-notch quality services while ensuring cost efficiency. At InEvolution, we value innovation, quality, and our team's growth and development.


About the Role


  • Work on building, processing, and transferring data and dashboards from the existing Domo platform to the Power BI platform.
  • Audit existing data systems and deployments and identify errors or areas for improvement.
  • Utilize Power BI to build interactive and visually appealing dashboards and reports.
  • Build data documentation and explanations of the parameters, filters, models, and relationships used in the dashboards.
  • Review existing SQL data sources to improve them and to connect and integrate them seamlessly with Power BI.
  • Create, test, and deploy Power BI scripts, and execute efficient migration practices.
  • Work closely with the current analytics team to define requirements and migration steps, and maintain open and transparent communication with the team when reviewing the migrated reports and data sources for successful outcomes.
  • Ensure all dashboards and data sources are thoroughly reviewed by the team before publishing to the production environment.
  • Convert business needs into technical specifications and establish a timeline for job completion.


Requirements & Skills:


  • 2+ years of experience using Power BI to run DAX queries and other advanced interactive functions.
  • 2+ years of experience with Data Analysis and Data Visualization tools.
  • 1+ year of experience working with relational databases and building SQL queries.
  • Familiarity with Data Collection, Cleaning and Transformation processes.
  • Attention to detail and the ability to work with complex datasets.
Intuitive Technology Partners
Posted by shalu Jain
Remote only
9 - 20 yrs
Best in industry
Architecture
Presales
Postsales
Amazon Web Services (AWS)
Databricks
+13 more

Intuitive Cloud (http://www.intuitive.cloud) is one of the fastest-growing top-tier cloud solutions and SDx engineering solutions and services companies, supporting 80+ global enterprise customers across the Americas, Europe, and the Middle East.

Intuitive is a recognized professional and managed services partner with core superpowers in cloud (public/hybrid), security, GRC, DevSecOps, SRE, application modernization/containers/K8s-as-a-service, and cloud application delivery.


Data Engineering:

  • 9+ years’ experience as a data engineer.
  • Must have 4+ years implementing data engineering solutions with Databricks.
  • This is a hands-on role building data pipelines using Databricks; hands-on technical experience with Apache Spark is required.
  • Must have deep expertise in one of the programming languages for data processing (Python, Scala). Experience with Python, PySpark, Hadoop, Hive and/or Spark to write data pipelines and data-processing layers.
  • Must have worked with relational databases like Snowflake. Good SQL experience writing complex SQL transformations.
  • Performance tuning of Spark SQL running on S3/Data Lake/Delta Lake storage, and strong knowledge of Databricks and cluster configurations (see the sketch after this list).
  • Hands-on architectural experience.
  • Nice to have: Databricks administration, including the security and infrastructure features of Databricks.
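As a minimal sketch of the kind of Databricks/Delta Lake pipeline described above (the paths, table names, and aggregation are placeholders, not details from this posting), a PySpark batch job might look like this:

```python
# Minimal PySpark sketch of a batch pipeline writing a Delta table.
# Assumes a Databricks/Delta runtime; paths and table names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

# Read raw JSON events from cloud storage (e.g. S3).
raw = spark.read.json("s3://example-bucket/raw/orders/")

# Basic cleansing and a daily aggregate.
daily = (
    raw.filter(F.col("amount") > 0)
       .withColumn("order_date", F.to_date("created_at"))
       .groupBy("order_date")
       .agg(F.count("*").alias("orders"), F.sum("amount").alias("revenue"))
)

# Write the result as a partitioned Delta table.
(daily.write.format("delta")
      .mode("overwrite")
      .partitionBy("order_date")
      .saveAsTable("analytics.daily_orders"))
```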
Shiprocket
Posted by sunil kumar
Gurugram
4 - 6 yrs
₹14L - ₹30L / yr
Data-flow analysis

Sr. Data Engineer

 

Company Profile:

 

Bigfoot Retail Solutions [Shiprocket] is a logistics platform that connects Indian eCommerce SMBs with logistics players to enable end-to-end solutions.

Our innovative, data-backed platform drives logistics efficiency, helps reduce cost, increases sales throughput by reducing RTO, and improves post-order customer engagement and experience.

Our vision is to power all logistics for the direct commerce market in India, including first mile, linehaul, last mile, warehousing, cross-border, and O2O.

 

Position: Sr. Data Engineer

Team: Business Intelligence

Location: New Delhi

 

Job Description:

We are looking for a savvy Data Engineer to join our growing team of analytics experts. The hire will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company’s data architecture to support our next generation of products and data initiatives.

 

Key Responsibilities:

  • Create and maintain optimal data pipeline architecture.
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Keep our data separated and secure across national boundaries through multiple data centres and AWS regions.
  • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
  • Work with data and analytics experts to strive for greater functionality in our data systems.

 

 

Qualifications for Data Engineer

  • Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
  • Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytic skills related to working with unstructured datasets.
  • Experience building processes supporting data transformation, data structures, metadata, dependency and workload management.
  • A successful history of manipulating, processing and extracting value from large disconnected datasets.
  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
  • Strong project management and organizational skills.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
  • We are looking for a candidate with 5+ years of experience in a Data Engineer role, who has attained a Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. They should also have experience using the following software/tools:
  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
  • Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
  • Experience with AWS cloud services: EC2, EMR, RDS, Redshift
  • Experience with stream-processing systems: Storm, Spark-Streaming, etc.
  • Experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc. (see the stream-processing sketch after this list)
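Purely to illustrate the stream-processing stack named above, here is a minimal Spark Structured Streaming sketch that consumes a Kafka topic and lands the events in cloud storage; the broker, topic, schema, and paths are hypothetical placeholders, and the spark-sql-kafka connector is assumed to be available.

```python
# Minimal Spark Structured Streaming sketch: Kafka topic -> cloud storage.
# Assumes the spark-sql-kafka connector is on the classpath; all names below
# are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("order_events_stream").getOrCreate()

schema = StructType([
    StructField("order_id", StringType()),
    StructField("status", StringType()),
    StructField("amount", DoubleType()),
])

events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "order-events")
         .load()
         # Kafka delivers bytes; decode and parse the JSON payload.
         .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
         .select("e.*")
)

# Append the parsed events to storage, with checkpointing for recovery.
query = (
    events.writeStream.format("parquet")
          .option("path", "s3://example-bucket/streams/order_events/")
          .option("checkpointLocation", "s3://example-bucket/checkpoints/order_events/")
          .outputMode("append")
          .start()
)
query.awaitTermination()
```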

 

 

 

 

 

Velocity Services
Bengaluru (Bangalore)
4 - 8 yrs
₹20L - ₹35L / yr
Data engineering
Data Engineer
Big Data
Big Data Engineer
Python
+10 more

We are an early stage start-up, building new fintech products for small businesses. Founders are IIT-IIM alumni, with prior experience across management consulting, venture capital and fintech startups. We are driven by the vision to empower small business owners with technology and dramatically improve their access to financial services. To start with, we are building a simple, yet powerful solution to address a deep pain point for these owners: cash flow management. Over time, we will also add digital banking and 1-click financing to our suite of offerings.

 

We have developed an MVP which is being tested in the market. We have closed our seed funding from marquee global investors and are now actively building a world class tech team. We are a young, passionate team with a strong grip on this space and are looking to on-board enthusiastic, entrepreneurial individuals to partner with us in this exciting journey. We offer a high degree of autonomy, a collaborative fast-paced work environment and most importantly, a chance to create unparalleled impact using technology.

 

Reach out if you want to get in on the ground floor of something which can turbocharge SME banking in India!

 

The technology stack at Velocity comprises a wide variety of cutting-edge technologies like NodeJS, Ruby on Rails, Reactive Programming, Kubernetes, AWS, Python, ReactJS, Redux (Saga), Redis, Lambda, etc.

 

Key Responsibilities

  • Responsible for building data and analytical engineering pipelines with standard ELT patterns, implementing data compaction pipelines, data modelling, and overseeing overall data quality

  • Work with the Office of the CTO as an active member of our architecture guild

  • Write pipelines to consume data from multiple sources

  • Write a data transformation layer using DBT to transform millions of records into data warehouses (see the sketch after this list)

  • Implement data warehouse entities with common, reusable data model designs, with automation and data quality capabilities

  • Identify downstream implications of data loads/migrations (e.g., data quality, regulatory)
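To make the ELT pattern above concrete, here is a minimal Python sketch that loads raw rows into the warehouse first and only then transforms them with SQL inside the warehouse (in practice that transform step is where a DBT model would live); the psycopg2 connection, file, and table names are assumptions for illustration only.

```python
# Minimal ELT sketch: load raw data as-is, then transform inside the warehouse.
# Connection details, file name, and table names are hypothetical placeholders.
import csv
import psycopg2

conn = psycopg2.connect("postgresql://user:password@warehouse-host:5432/analytics")

with conn, conn.cursor() as cur:
    # Load: copy raw CSV rows into a staging table untouched (the "L" in ELT).
    with open("payments_export.csv", newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        for row in reader:
            cur.execute(
                "INSERT INTO raw.payments (payment_id, customer_id, amount, paid_at) "
                "VALUES (%s, %s, %s, %s)",
                row,
            )

    # Transform: build the reporting model with SQL in the warehouse
    # (in a real stack this statement would live in a DBT model).
    cur.execute("""
        CREATE TABLE IF NOT EXISTS analytics.daily_payments AS
        SELECT paid_at::date AS pay_date,
               COUNT(*)      AS payments,
               SUM(amount)   AS total_amount
        FROM raw.payments
        GROUP BY 1
    """)
```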

 

What To Bring

  • 3+ years of software development experience; startup experience is a plus.

  • Past experience working with Airflow and DBT is preferred.

  • 2+ years of experience working in any backend programming language.

  • Strong first-hand experience with data pipelines and relational databases such as Oracle, Postgres, SQL Server or MySQL.

  • Experience with DevOps tools (GitHub, Travis CI, and JIRA) and methodologies (Lean, Agile, Scrum, Test-Driven Development).

  • Experience formulating ideas, building proofs of concept (POCs), and converting them into production-ready projects.

  • Experience building and deploying applications on on-premise and cloud-based (AWS or Google Cloud) infrastructure.

  • A basic understanding of Kubernetes & Docker is a must.

  • Experience in data processing (ETL, ELT) and/or cloud-based platforms.

  • Working proficiency and communication skills in verbal and written English.

 

 

 

The other Fruit
Posted by Dipendra SIngh
Pune
1 - 5 yrs
₹3L - ₹15L / yr
Machine Learning (ML)
Artificial Intelligence (AI)
Python
Data Structures
Algorithms
+17 more
 
SD (ML and AI) job description:

  • Advanced degree in computer science, math, statistics or a related discipline (a master's degree is a must)
  • Extensive data modeling and data architecture skills
  • Programming experience in Python and R
  • Background in machine learning frameworks such as TensorFlow or Keras
  • Knowledge of Hadoop or other distributed computing systems
  • Experience working in an Agile environment
  • Advanced math skills (important): linear algebra, discrete math, differential equations (ODEs and numerical), theory of statistics 1, numerical analysis 1 (numerical linear algebra) and 2 (quadrature), abstract algebra, number theory, real analysis, complex analysis, and intermediate analysis (point-set topology)
  • Strong written and verbal communication skills
  • Hands-on experience with NLP and NLG
  • Experience with advanced statistical techniques and concepts (GLM/regression, random forests, boosting, trees, text mining) and their application (see the sketch after this list)
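As a minimal illustration of the statistical techniques listed above, here is a small scikit-learn sketch that fits and evaluates a random forest classifier on synthetic data; the dataset and hyperparameters are placeholders chosen only for the example.

```python
# Minimal scikit-learn sketch: fit and evaluate a random forest classifier.
# The synthetic dataset and hyperparameters are illustrative placeholders.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic binary-classification data standing in for real business data.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=200, max_depth=8, random_state=42)
model.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```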
 