- The ideal candidate is adept at using large data sets to find opportunities for product and process optimization and using models to test the effectiveness of different courses of action.
- Mine and analyze data from company databases to drive optimization and improvement of product development, marketing techniques and business strategies.
- Assess the effectiveness and accuracy of new data sources and data gathering techniques.
- Develop custom data models and algorithms to apply to data sets.
- Use predictive modeling to increase and optimize customer experiences, revenue generation, ad targeting and other business outcomes.
- Develop company A/B testing framework and test model quality.
- Develop processes and tools to monitor and analyze model performance and data accuracy.
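The A/B testing and model-quality duties above come down to comparing outcomes with statistical rigor. As a minimal, illustrative sketch (toy numbers, all names hypothetical), a two-proportion z-test using only the Python standard library:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via math.erf)
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Variant B converts 120/1000 vs. control A at 100/1000
z, p = two_proportion_z_test(100, 1000, 120, 1000)
```

A real framework would add guardrails (sample-size planning, sequential-testing corrections), but the core comparison is this small.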
Qualifications
- Experience using statistical languages (R, Python, SQL, etc.) to manipulate data and draw insights from large data sets.
- Experience working with and creating data architectures.
- Looking for someone with 3-7 years of experience manipulating data sets and building statistical models.
- Bachelor's or Master's degree in Computer Science or another quantitative field.
- Knowledge and experience in statistical and data mining techniques:
- GLM/Regression, Random Forest, Boosting, Trees, text mining, social network analysis, etc.
- Experience querying databases and using statistical computer languages: R, Python, SQL, etc.
- Experience creating and using advanced machine learning algorithms and statistics: regression, simulation, scenario analysis, modeling, clustering, decision trees, neural networks, etc.
- Experience with distributed data/computing tools: Map/Reduce, Hadoop, Hive, Spark, Gurobi, MySQL, etc.
- Experience visualizing/presenting data for stakeholders using: Periscope, Business Objects, D3, ggplot, etc.
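The GLM/regression skills listed above can be made concrete with a tiny example. A minimal sketch (toy data, all names hypothetical): logistic regression, a GLM with a logit link, fit by batch gradient descent in pure Python:

```python
from math import exp

def sigmoid(z):
    return 1.0 / (1.0 + exp(-z))

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    """Fit y ~ sigmoid(w*x + b) by minimizing log-loss with gradient descent."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        grad_w = grad_b = 0.0
        for x, y in zip(xs, ys):
            err = sigmoid(w * x + b) - y   # gradient of log-loss per sample
            grad_w += err * x
            grad_b += err
        w -= lr * grad_w / n
        b -= lr * grad_b / n
    return w, b

# Toy data: larger x makes y = 1 more likely
xs = [0.5, 1.0, 1.5, 3.0, 3.5, 4.0]
ys = [0, 0, 0, 1, 1, 1]
w, b = fit_logistic(xs, ys)
predict = lambda x: sigmoid(w * x + b)
```

In practice one would reach for statsmodels or scikit-learn; the point is that the fitted model is just a link function over a linear predictor.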
Greetings!
We are looking for a Data Engineer for one of our premium clients, for their Chennai and Tirunelveli locations.
Required Education/Experience
● Bachelor’s degree in Computer Science or related field
● 5-7 years’ experience in the following:
● Snowflake and Databricks management
● Python and AWS Lambda
● Scala and/or Java
● Data integration services, SQL, and Extract, Transform, Load (ETL)
● Azure or AWS for development and deployment
● Jira or similar tool during SDLC
● Experience managing codebase using Code repository in Git/GitHub or Bitbucket
● Experience working with a data warehouse.
● Familiarity with structured and semi-structured data formats including JSON, Avro, ORC, Parquet, or XML
● Exposure to working in an agile work environment
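Semi-structured formats like the JSON named above usually need flattening before they can land in warehouse tables. A minimal sketch using only the standard library (the record layout is hypothetical):

```python
import json

def flatten(record, parent_key="", sep="."):
    """Flatten nested dicts into dotted column names, e.g. user.address.city."""
    items = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            items.update(flatten(value, new_key, sep))
        else:
            items[new_key] = value
    return items

raw = '{"id": 7, "user": {"name": "Asha", "address": {"city": "Chennai"}}}'
row = flatten(json.loads(raw))
# row == {"id": 7, "user.name": "Asha", "user.address.city": "Chennai"}
```

Columnar formats such as Avro, ORC, and Parquet encode this nesting natively; flattening like this is typical when loading raw JSON into flat relational tables.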
Responsibilities
Research, develop, and maintain machine learning and statistical models for business requirements
Work across the spectrum of statistical modelling, including supervised, unsupervised, and deep learning techniques, to apply the right level of solution to the right problem
Coordinate with different functional teams to monitor outcomes and refine/improve the machine learning models
Implement models to uncover patterns and predictions, creating business value and innovation
Identify unexplored data opportunities for the business to unlock and maximize the potential of digital data within the organization
Develop NLP concepts and algorithms to classify and summarize structured/unstructured text data
Qualifications
3+ years of experience solving complex business problems using machine learning.
Fluency in Python is a must, along with experience in NLP techniques and models such as BERT.
Strong analytical and critical thinking skills.
Experience in building production-quality models using state-of-the-art technologies.
Familiarity with databases is desirable.
Ability to collaborate on projects and work independently when required.
Previous experience in the Fintech/payments domain is a bonus.
Bachelor's or Master's degree in Computer Science, Statistics, Mathematics, or another quantitative field from a top-tier institute.
Design, implement, and improve the analytics platform
Implement and simplify self-service data query and analysis capabilities of the BI platform
Develop and improve the current BI architecture, emphasizing data security, data quality and timeliness, scalability, and extensibility
Deploy and use various big data technologies and run pilots to design low-latency data architectures at scale
Collaborate with business analysts, data scientists, product managers, software development engineers, and other BI teams to develop, implement, and validate KPIs, statistical analyses, data profiling, prediction, forecasting, clustering, and machine learning algorithms
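The clustering work mentioned above can be sketched with a toy k-means pass in pure Python (1-D toy data; function and variable names are hypothetical):

```python
from statistics import mean

def kmeans_1d(points, centers, iters=20):
    """Toy 1-D k-means: assign each point to its nearest center, then re-center."""
    for _ in range(iters):
        clusters = {c: [] for c in centers}
        for p in points:
            nearest = min(centers, key=lambda c: abs(p - c))
            clusters[nearest].append(p)
        # New center = mean of assigned points; keep old center if cluster emptied
        centers = [mean(members) if members else c
                   for c, members in clusters.items()]
    return sorted(centers)

# Two obvious groups, around 1 and around 10
points = [0.9, 1.0, 1.1, 9.8, 10.0, 10.2]
centers = kmeans_1d(points, centers=[0.0, 5.0])
```

Production work would use scikit-learn or Spark MLlib, but the assign/re-center loop is the whole algorithm.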
Qualifications
At Ganit we are building an elite team, so we are seeking candidates with the following backgrounds:
7+ years relevant experience
Expert level skills writing and optimizing complex SQL
Knowledge of data warehousing concepts
Experience in data mining, profiling, and analysis
Experience with complex data modelling, ETL design, and using large databases in a business environment
Proficiency with Linux command line and systems administration
Experience with languages like Python/Java/Scala
Experience with Big Data technologies such as Hive/Spark
Proven ability to develop unconventional solutions, see opportunities to innovate, and lead the way
Good experience working with cloud platforms such as AWS, GCP, and Azure, including projects involving the creation of a data lake or data warehouse
Excellent verbal and written communication.
Proven interpersonal skills and ability to convey key insights from complex analyses in summarized business terms. Ability to effectively communicate with multiple teams
Good to have
AWS/GCP/Azure Data Engineer Certification
What is the role?
You will be responsible for developing and designing the front-end web architecture, ensuring the responsiveness of applications, and working alongside graphic designers on web design features, among other duties. You will also be responsible for the functional/technical track of the project.
Key Responsibilities
- Develop and automate large-scale, high-performance data processing systems (batch and/or streaming).
- Build high-quality software engineering practices towards building data infrastructure and pipelines at scale.
- Lead data engineering projects to ensure pipelines are reliable, efficient, testable, & maintainable
- Optimize performance to meet high throughput and scale
What are we looking for?
- 4+ years of relevant industry experience.
- Working with data at the terabyte scale.
- Experience designing, building and operating robust distributed systems.
- Experience designing and deploying high throughput and low latency systems with reliable monitoring and logging practices.
- Building and leading teams.
- Working knowledge of relational databases like PostgreSQL/MySQL.
- Experience with Python / Spark / Kafka / Celery
- Experience working with OLTP and OLAP systems
- Excellent communication skills, both written and verbal.
- Experience working in cloud e.g., AWS, Azure or GCP
Whom will you work with?
You will work with a top-notch tech team, working closely with the architect and engineering head.
What can you look for?
A wholesome opportunity in a fast-paced environment that will enable you to juggle multiple concepts while maintaining quality, interact and share your ideas, and enjoy plenty of learning at work. Work with a team of highly talented young professionals and enjoy the benefits of being at this company.
We are
We strive to make selling fun with our SaaS incentive gamification product. Company is the #1 gamification software that automates and digitizes Sales Contests and Commission Programs. With game-like elements, rewards, recognitions, and complete access to relevant information, Company turbocharges an entire salesforce. Company also empowers Sales Managers with easy-to-publish game templates, leaderboards, and analytics to help accelerate performances and sustain growth.
We are a fun and high-energy team, with people from diverse backgrounds - united under the passion of getting things done. Rest assured that you shall get complete autonomy in your tasks and ample opportunities to develop your strengths.
Way forward
If you find this role exciting and want to join us in Bangalore, India, then apply by clicking below, provide your details, and upload your resume. All received resumes will be screened; shortlisted candidates will be invited for a discussion, and on mutual alignment and agreement we will proceed with hiring.
- Assisting developers to write efficient code.
- Tuning and debugging customer installations.
- Maintaining internal development and test databases.
- Working closely with leading experts in Supply Chain management to support various E2open teams.
- Providing on-call support in a 24x7 environment.
The position requires night shift work to support the US time zone.
Required Experience/Skills:
- Bachelor's/Master's/PhD degree in Engineering, Computer Science, Mathematics, or other Science with a consistent academic record (Preferably more than 70%)
- Oracle-certified DBA with 3+ years of DBA experience
- Oracle 12c and 19c experience is a must.
- Experience in Performance Tuning and Query Optimization is a must.
- Expertise and development experience in PL/SQL and SQL
- Experience in Oracle installation, upgrade, backup, and recovery methods.
- Experience in Unix/Linux shell scripting.
- Track record of delivering quality work on time and ability to expand own expertise.
- Self-motivated, detail and team-oriented
- Solid verbal and written communication skills
- Enjoys a dynamic, result-oriented work culture.
- Excellent problem-solving and troubleshooting skills.
- Experience considered a plus:
- MySQL and PostgreSQL
- Knowledge of server, storage, and networking technology.
- Supply Chain background
Data Analyst Job Duties
Data analyst responsibilities include conducting full lifecycle analysis to include requirements, activities and design. Data analysts will develop analysis and reporting capabilities. They will also monitor performance and quality control plans to identify improvements.
Responsibilities
- Interpret data, analyze results using statistical techniques, and provide ongoing reports
- Develop and implement databases, data collection systems, data analytics, and other strategies that optimize statistical efficiency and quality
- Acquire data from primary or secondary data sources and maintain databases/data systems
- Identify, analyze, and interpret trends or patterns in complex data sets
- Filter and “clean” data by reviewing computer reports, printouts, and performance indicators to locate and correct code problems
- Work with management to prioritize business and information needs
- Locate and define new process improvement opportunities
Requirements
- Proven working experience as a Data Analyst or Business Data Analyst
- Technical expertise regarding data models, database design and development, data mining, and segmentation techniques
- Strong knowledge of and experience with reporting packages (Business Objects etc.), databases (SQL etc.), and programming (XML, JavaScript, or ETL frameworks)
- Knowledge of statistics and experience using statistical packages for analyzing datasets (Excel, SPSS, SAS etc.)
- Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy
- Adept at queries, report writing, and presenting findings
- BS in Mathematics, Economics, Computer Science, Information Management, or Statistics
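The filter-and-clean duty above amounts to a validation pass over incoming rows. A minimal sketch (column names and rules are hypothetical):

```python
def clean_rows(rows):
    """Drop rows with missing ids, coerce amounts to float, reject out-of-range values."""
    cleaned, rejected = [], []
    for row in rows:
        if not row.get("id"):
            rejected.append((row, "missing id"))
            continue
        try:
            amount = float(row["amount"])
        except (KeyError, TypeError, ValueError):
            rejected.append((row, "bad amount"))
            continue
        if amount < 0:
            rejected.append((row, "negative amount"))
            continue
        cleaned.append({"id": row["id"], "amount": amount})
    return cleaned, rejected

raw = [
    {"id": "a1", "amount": "19.99"},
    {"id": "",   "amount": "5.00"},    # missing id
    {"id": "a2", "amount": "oops"},    # unparseable amount
    {"id": "a3", "amount": "-3.00"},   # out of range
]
cleaned, rejected = clean_rows(raw)
```

Keeping the rejected rows with a reason code, rather than silently dropping them, is what makes the "locate and correct code problems" part of the job possible.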
• Responsible for developing and maintaining applications with PySpark
• Contribute to the overall design and architecture of the application developed and deployed.
• Performance tuning with respect to executor sizing and other environment parameters, code optimization, partition tuning, etc.
• Interact with business users to understand requirements and troubleshoot issues.
• Implement projects based on functional specifications.
Must-Have Skills:
• Good experience in PySpark, including DataFrame core functions and Spark SQL
• Good customer communication skills
• Good analytical skills
About Us
Punchh is the leader in customer loyalty, offer management, and AI solutions for offline and omni-channel merchants including restaurants, convenience stores, and retailers. Punchh brings the power of online to physical brands by delivering omni-channel experiences and personalization across the entire customer journey--from acquisition through loyalty and growth--to drive same store sales and customer lifetime value. Punchh uses best-in-class integrations to POS and other in-store systems such as WiFi, to deliver real-time SKU-level transaction visibility and offer provisioning for physical stores.
Punchh is growing exponentially, serves 200+ brands that encompass 91K+ stores globally. Punchh’s customers include the top convenience stores such as Casey’s General Stores, 25+ of the top 100 restaurant brands such as Papa John's, Little Caesars, Denny’s, Focus Brands (5 of 7 brands), and Yum! Brands (KFC, Pizza Hut, and Taco Bell), and retailers. For a multi-billion $ brand with 6K+ stores, Punchh drove a 3% lift in same-store sales within the first year. Punchh is powering loyalty programs for 135+ million consumers.
Punchh has raised $70 million from premier Silicon Valley investors including Sapphire Ventures and Adam Street Partners, has a seasoned leadership team with extensive experience in digital, marketing, CRM, and AI technologies as well as deep restaurant and retail industry expertise.
About the Role:
Punchh Tech India Pvt. is looking for a Senior Data Analyst – Business Insights to join our team. If you're excited to be part of a winning team, Punchh is a great place to grow your career.
This position is responsible for discovering the important trends in the complex data generated on the Punchh platform that have high business impact (influencing product features and roadmap), creating hypotheses around these trends, validating them with statistical significance, and making recommendations.
Reporting to: Director, Analytics
Job Location: Jaipur
Experience Required: 4-6 years
What You’ll Do
- Take ownership of custom data analysis projects/requests and work closely with end users (both internal and external clients) to deliver the results
- Identify successful implementation/utilization of product features and contribute to the best-practices playbook for client facing teams (Customer Success)
- Strive towards building mini business intelligence products that add value to the client base
- Represent the company’s expertise in advanced analytics in a variety of media outlets such as client interactions, conferences, blogs, and interviews.
What You’ll Need
- Masters in business/behavioral economics/statistics with a strong interest in marketing technology
- Proven track record of at least 5 years uncovering business insights, especially related to Behavioral Economics and adding value to businesses
- Proficient in using the proper statistical and econometric approaches to establish the presence and strength of trends in data. Strong statistical knowledge is mandatory.
- Extensive prior exposure to causal inference studies, based on both longitudinal and cross-sectional data.
- Excellent experience using Python (or R) to analyze data from extremely large or complex data sets
- Exceptional data querying skills (Snowflake/Redshift, Spark, Presto/Athena, to name a few)
- Ability to effectively articulate complex ideas in simple and effective presentations to diverse groups of stakeholders.
- Experience working with a visualization tool (preferably, but not restricted to Tableau)
- Domain expertise: extensive exposure to retail business, restaurant business or worked on loyalty programs and promotion/campaign effectiveness
- Should be self-organized and be able to proactively identify problems and propose solutions
- Gels well within and across teams, work with stakeholders from various functions such as Product, Customer Success, Implementations among others
- As the business-side stakeholders are based in the US, should be flexible to schedule meetings convenient for West Coast hours
- Effective in working autonomously to get things done and taking the initiatives to anticipate needs of executive leadership
- Able and willing to relocate to Jaipur post pandemic.
Benefits:
- Medical Coverage, to keep you and your family healthy.
- Compensation that stacks up with other tech companies in your area.
- Paid vacation days and holidays to rest and relax.
- Healthy lunch provided daily to fuel you through your work.
- Opportunities for career growth and training support, including fun team building events.
- Flexibility and a comfortable work environment for you to feel your best.