Survey Analytics Analyst
We are looking for candidates who have demonstrated both a strong business sense and deep understanding of the quantitative foundations of modelling.
• Excellent analytical and problem-solving skills, including the ability to disaggregate issues, identify root causes and recommend solutions
• Experience with statistical programming software (SPSS) and comfort working with large data sets
• R, Python, SAS, and SQL are preferred but not mandatory
• Excellent time management skills
• Good written and verbal communication skills, including a strong command of written and spoken English
• Strong interpersonal skills
• Ability to act autonomously, bringing structure and organization to work
• Creative and action-oriented mindset
• Ability to interact in a fluid, demanding and unstructured environment where priorities evolve constantly, and methodologies are regularly challenged
• Ability to work under pressure and deliver on tight deadlines
Qualifications and Experience:
• Graduate degree in Statistics/Economics/Econometrics/Computer Science/Engineering/Mathematics/MBA (with a strong quantitative background) or equivalent
• Strong track record of work experience in business intelligence, market research, and/or advanced analytics
• Knowledge of data collection methods (focus groups, surveys, etc.)
• Knowledge of statistical packages (SPSS, SAS, R, Python, or similar), databases, and MS Office (Excel, PowerPoint, Word)
• Strong analytical and critical thinking skills
• Industry experience in Consumer Experience/Healthcare a plus
About Leading Management Consulting Firm
● Knowledge of Excel, SQL, and writing code in Python.
● Experience with reporting and business intelligence tools like Tableau and Metabase.
● Exposure to distributed analytics processing technologies (e.g. Hive, Spark) is desired.
● Experience with Clevertap, Mixpanel, Amplitude, etc.
● Excellent communication skills.
● Background in market research and project management.
● Attention to detail.
● Problem-solving aptitude.
Have 2 to 6 years of experience working in a similar role in a startup environment
SQL and Excel have no secrets for you
You love visualizing data with Tableau
Any experience with product analytics tools (Mixpanel, Clevertap) is a plus
You solve math puzzles for fun
A strong analytical mindset with a problem-solving attitude
Comfortable with being critical and speaking your mind
You can easily switch between coding (R or Python) and having a business discussion
Be a team player who thrives in a fast-paced and constantly changing environment
Job role:
As a data analyst, you will be responsible for compiling actionable insights from data and helping program, sales, and marketing managers build data-driven processes. Your role will involve driving initiatives to optimize operational excellence and revenue.
Responsibilities:
- Ensure that data flows smoothly from source to destination so that it can be processed
- Utilize strong database skills to work with large, complex data sets to extract insights
- Filter and cleanse unstructured (or ambiguous) data into usable data sets that can be analyzed to extract insights and improve business processes
- Identify new internal and external data sources to support analytics initiatives and work with appropriate partners to absorb the data into new or existing data infrastructure
- Build tools for automating repetitive tasks so that bandwidth can be freed for analytics
- Collaborate with program managers and business analysts to help them come up with actionable, high-impact insights across product lines and functions
- Work closely with top management to prioritize information and analytic needs
Requirements:
- Excel
- Web scraping
- Intermediate communication skills (spoken and written)
- Knowledge of the SaaS industry
- Add Product
- Check Dodontdo
- Work on support tickets (resolve customer issues)
Web/data/Excel scraping is a must
6+ years’ experience in Azure architecture, Azure Data Services, design, and development. Significant experience as a DW architect on several initiatives.
• Rich data/dimensional modeling expertise
• Expert knowledge of Azure tools and services centered around data and analytics (Azure Synapse, Data Lake, Data Factory, etc.)
• Experience designing and building complete ETL processes moving and transforming data across ODS, Staging, and Data Warehousing, cloud and hybrid
Education and Experience:
• Bachelor’s degree in computer science or a related field required
• 10 or more years’ experience managing and designing systems in an enterprise infrastructure environment required
We are #hiring an AWS Data Engineer expert to join our team
Job Title: AWS Data Engineer
Experience: 5 to 10 years
Location: Remote
Notice: Immediate or Max 20 Days
Role: Permanent Role
Skillset: AWS, ETL, SQL, Python, PySpark, Postgres DB, Dremio
Job Description:
Able to develop ETL jobs.
Able to help with data curation/cleanup, data transformation, and building ETL pipelines.
Strong Postgres DB experience; knowledge of Dremio as a data visualization/semantic layer between the database and the application is a plus.
SQL, Python, and PySpark are a must.
Good communication skills.
Work timings: 4:00 PM to 11:30 PM
Full-time, work from home
6+ years in data science
Strong experience in ML: regression, classification, anomaly detection, NLP, deep learning, predictive analytics, and predictive maintenance; Python required. Data visualization skills are an added advantage.
- Performs analytics to extract insights from the organization's raw historical data.
- Generates usable training datasets for any/all MV projects with the help of annotators, if needed.
- Analyzes user trends and identifies the biggest bottlenecks in the Hammoq workflow.
- Tests the short/long term impact of productized MV models on those trends.
- Skills: NumPy, Pandas, Apache Spark, PySpark, and ETL mandatory.
Required: Python and R; experience handling large-scale data engineering pipelines.
Excellent verbal and written communication skills.
Proficient in PowerPoint or other presentation tools.
Ability to work quickly and accurately on multiple projects.
Roles & Responsibilities
- Proven experience deploying and tuning open source components into enterprise-ready production tooling
- Experience with data centre (Metal as a Service, MAAS) and cloud deployment technologies (AWS or GCP Architect certificates required)
- Deep understanding of Linux from kernel mechanisms through user space management
- Experience with CI/CD (Continuous Integration and Deployment) system solutions (Jenkins)
- Use monitoring tools (local and on public cloud platforms) such as Nagios, Prometheus, Sensu, ELK, CloudWatch, Splunk, and New Relic to trigger instant alerts, reports, and dashboards
- Work closely with the development and infrastructure teams to analyze and design solutions with four-nines (99.99%) uptime for globally distributed, clustered, production and non-production virtualized infrastructure
- Wide understanding of IP networking as well as data centre infrastructure
Skills
- Expert with software development tools and source code management: understanding and managing issues and code changes, and grouping them into deployment releases in a stable and measurable way to maximize production
- Must be expert at developing and using Ansible roles and configuring deployment templates with Jinja2
- Solid understanding of data collection tools like Flume, Filebeat, Metricbeat, JMX Exporter agents.
- Extensive experience operating and tuning the Kafka streaming data platform, specifically as a message queue for big data processing
- Strong understanding of, and hands-on experience with:
- The Apache Spark framework, specifically Spark Core and Spark Streaming
- Orchestration platforms: Mesos and Kubernetes
- Data storage platforms: Elastic Stack, Carbon, ClickHouse, Cassandra, Ceph, HDFS
- Core presentation technologies: Kibana and Grafana
- Excellent scripting and programming skills (Bash, Python, Java, Go, Rust). Must have previous experience with Rust in order to support and improve in-house developed products
Certification
Red Hat Certified Architect certificate or equivalent required. CCNA certificate required. 3–5 years of experience running open source big data platforms.
- Build a team with skills in ETL, reporting, MDM and ad-hoc analytics support
- Build technical solutions using latest open source and cloud based technologies
- Work closely with offshore senior consultant, onshore team and client's business and IT teams to gather project requirements
- Assist overall project execution from India, starting from project planning, team formation, system design and development, testing, UAT, and deployment
- Build demos and POCs in support of business development for new and existing clients
- Prepare project documents and PowerPoint presentations for client communication
- Conduct training sessions to train associates and help shape their growth