- Experience in Analytics
- Minimum 2-3 years of experience in Tableau
- Strong experience with design and development of Tableau visualizations
- Strong SQL (Oracle or SQL Server).
- Create complex formulas and calculations within Tableau to implement complex business logic.
- Create action filters, parameters and calculations for preparing dashboards and worksheets in Tableau.
- Experience in production support and issue resolution, including debugging of reporting issues, dashboard design issues, archiving, and performance issues.
- Fix bugs, troubleshoot and resolve problems.
- Ownership of development, testing and production.
- Good communication skills are essential
About IDC Technologies, Inc.
Data Scientist (Risk)/Sr. Data Scientist (Risk)
As a part of the Data science/Analytics team at Rupifi, you will play a significant role in helping define the business/product vision and deliver it from the ground up by working with passionate high-performing individuals in a very fast-paced working environment.
You will work closely with Data Scientists & Analysts, Engineers, Designers, Product Managers, Ops Managers and Business Leaders, and help the team make informed, data-driven decisions and deliver high business impact.
Preferred Skills & Responsibilities:
- Analyze data to better understand potential risks, concerns and outcomes of decisions.
- Aggregate data from multiple sources to provide a comprehensive assessment.
- Experience working with business users to understand and define inputs for risk models.
- Ability to design and implement best-in-class risk models in the banking & fintech domain.
- Ability to quickly understand changing market trends and incorporate them into model inputs.
- Expertise in statistical analysis and modeling.
- Ability to translate complex model outputs into understandable insights for business users.
- Collaborate with other team members to effectively analyze and present data.
- Conduct research into potential clients and understand the risks of accepting each one.
- Monitor internal and external data points that may affect the risk level of a decision.
- Hands-on experience in Python & SQL.
- Hands-on experience with a visualization tool, preferably Tableau
- Hands-on experience in machine learning and deep learning
- Experience in handling complex data sources
- Experience in modeling techniques in the fintech/banking domain
- Experience working with big data and distributed computing.
- A BTech/BE/MSc degree in Math, Engineering, Statistics, Economics, ML, Operations Research, or similar quantitative field.
- 3 to 10 years of modeling experience in the fintech/banking domain in fields like collections, underwriting, customer management, etc.
- Strong analytical skills with good problem-solving ability
- Strong presentation and communication skills
- Experience in working on advanced machine learning techniques
- Quantitative and analytical skills with a demonstrated ability to understand new analytical concepts.
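The underwriting and collections modeling work described above typically centers on scoring models. As a minimal illustration (not Rupifi's actual model), the sketch below applies a logistic function to hand-picked, hypothetical coefficients; a real risk model would fit its weights on historical repayment data.

```python
import math

def logistic(z):
    """Map a linear score to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients for illustration only; real weights would be
# estimated from labeled outcomes (e.g. defaults) in the lending portfolio.
WEIGHTS = {"utilization": 2.1, "late_payments": 0.9, "tenure_years": -0.4}
INTERCEPT = -1.5

def default_probability(borrower):
    """Score one borrower dict with the linear model + logistic link."""
    z = INTERCEPT + sum(WEIGHTS[k] * borrower[k] for k in WEIGHTS)
    return logistic(z)

low_risk = {"utilization": 0.2, "late_payments": 0, "tenure_years": 5}
high_risk = {"utilization": 0.9, "late_payments": 3, "tenure_years": 1}
```

The same structure extends to translating model outputs for business users: the probability, not the raw score, is what gets communicated.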
Who Are We?
Vahak (https://www.vahak.in) is India’s largest and most trusted online transport marketplace and directory for road transport businesses and individual commercial vehicle (trucks, trailers, containers, Hyva, LCVs) owners, covering online truck and load booking, transport business branding, and transport business network expansion. Lorry owners can find intercity and intracity loads from all over India and connect with other businesses to find trusted transporters and the best deals in the Indian logistics services market. With the Vahak app, users can book loads and lorries from a live transport marketplace with over 5 lakh transporters and lorry owners across 10,000+ locations for daily transport requirements.
Vahak has raised $5+ million in a Pre-Series A round from RTP Global, with participation from Luxor Capital and Leo Capital. The other marquee angel investors include Kunal Shah, Founder and CEO, CRED; Jitendra Gupta, Founder and CEO, Jupiter; Vidit Aatrey and Sanjeev Barnwal, Co-founders, Meesho; Mohd Farid, Co-founder, Sharechat; Amrish Rau, CEO, Pine Labs; Harsimarbir Singh, Co-founder, Pristyn Care; Rohit and Kunal Bahl, Co-founders, Snapdeal; and Ravish Naresh, Co-founder and CEO, Khatabook.
We at Vahak are looking for an enthusiastic and passionate Data Scientist to join our young and diverse team. You will play a key role in the data science group, crunching numbers, building advanced analytical models and predicting critical business metrics from volumes of big data.
Our goal as a group is to drive powerful big data analytics products with scalable results. We love people who are humble and collaborative, with a hunger for excellence.
- Be the go-to person for all advanced analytics (ML/AI) use cases within the larger data science group
- Build predictive models and machine-learning algorithms to solve business problems by leveraging both batch and real-time datasets
- Collaborate with engineering and product development teams in the data collection and deployment phase of the model building process
- Present model findings in crisp presentations using data visualization techniques
- Analyze large amounts of information to discover trends and patterns
- Mine and analyze data from company databases to drive optimization and improvement of product development, marketing techniques and business strategies.
- Bachelor’s or Master’s degree in a highly numerate discipline such as Engineering, Science or Economics
- 2+ years of proven experience working as a Data Scientist, preferably in an e-commerce, web-based or consumer technology company
- Hands-on experience building machine learning models from scratch and deploying them for large-scale use cases
- Hands-on experience working with machine learning frameworks, libraries, data structures and data modelling techniques
- Strong problem solving skills with an emphasis on product development.
- Experience using statistical computing languages (R, Python, SQL, etc.) to manipulate data and draw insights from large data sets.
- Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.) and their real-world advantages/drawbacks.
- Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and proper usage, etc.) and experience with applications.
- Demonstrated experience of participating in Data Science competitions on platforms like Kaggle would be an added advantage
- Experience using business intelligence tools e.g. Tableau, Power BI would be an added advantage (not mandatory)
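The statistical techniques listed above (regression, distributions, tests) start from fundamentals like ordinary least squares. As a small, self-contained sketch with made-up data, the function below fits a line y = a + b*x by hand, without any library:

```python
def fit_line(xs, ys):
    """Ordinary least squares fit of y = a + b*x."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    # Slope = covariance(x, y) / variance(x); intercept from the means.
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    a = mean_y - b * mean_x
    return a, b

# Hypothetical weekly metric trending upward over time.
xs = [1, 2, 3, 4, 5]
ys = [2.1, 4.0, 6.2, 7.9, 10.1]
a, b = fit_line(xs, ys)
```

In practice one would reach for `statistics.linear_regression` (Python 3.10+) or a library like statsmodels, but the closed form above is the same computation.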
Hypersonix.ai - Data Scientist
As a Data Scientist with Hypersonix (https://hypersonix.ai/), you will play a key role in translating data into insights for our clients. You will design, develop and implement processes and frameworks in our AI platform that will help our clients make sense of the data they generate and consume the insights to make informed decisions.
Role – Analytics
Solve business problems and develop business solutions: use problem-solving methodologies to propose creative solutions to a business problem. Recommend, design, and develop state-of-the-art data-driven analyses using statistical and advanced analytics methodologies. Develop models and recommend insights. Form hypotheses and run experiments to gain empirical insights and validate them. Identify and eliminate possible obstacles, and identify alternative creative solutions.
- Experience in design and review of new solution concepts and leading the delivery of high-impact analytics solutions and programs for global clients
- Identify opportunities and partner with key stakeholders to set priorities, manage expectations, facilitate change required to activate insights, and measure the impact
- Deconstruct problems and goals to form a clear picture for hypothesis generation and use best practices around decision science approaches and technology to solve business challenges;
- Integrate custom analytical solutions (e.g., predictive modeling, segmentation, issue tree frameworks) to support data-driven decision-making;
- Translate and communicate results, recommendations, and opportunities to improve data solutions to internal and external leadership with easily consumable reports and presentations.
- Should be able to apply domain knowledge to functional areas like market size estimation, business growth strategy, strategic revenue management, and marketing effectiveness
- Have business acumen to manage revenues profitably and meet financial goals consistently. Able to quantify business value for clients and create win-win commercial propositions.
- Good thought leadership and the ability to structure and solve business problems, innovating where required
- Must have the ability to adapt to changing business priorities in a fast-paced business environment
- Should have the ability to handle structured /unstructured data and have prior experience in loading, validating, and cleaning various types of data
- Should have a very good understanding of data structures and algorithms
- Experience leading and working independently on end-to-end projects in a fast-paced environment is strongly preferred
- Advanced knowledge of SQL/Redshift with proficiency in Python/R
- Sound knowledge of advanced analytics and machine learning techniques such as segmentation/clustering, recommendation engines, propensity models, and forecasting to drive growth throughout the customer lifecycle. Should be able to evaluate and bring in new advanced techniques to enhance the value-add for clients
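Of the techniques named above, segmentation/clustering is the most self-contained to illustrate. The sketch below is a naive one-dimensional k-means on made-up customer spend values (not Hypersonix code); real segmentation would use multi-dimensional features and a library such as scikit-learn.

```python
import random

def kmeans_1d(values, k, iters=20, seed=0):
    """Naive 1-D k-means, e.g. to segment customers by monthly spend."""
    rng = random.Random(seed)
    centers = rng.sample(values, k)
    for _ in range(iters):
        # Assign each value to its nearest center.
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda i: abs(v - centers[i]))
            clusters[nearest].append(v)
        # Move each center to the mean of its cluster (keep it if empty).
        centers = [sum(c) / len(c) if c else centers[j]
                   for j, c in enumerate(clusters)]
    return sorted(centers)

# Hypothetical monthly spend for two obvious customer segments.
spend = [10, 12, 11, 13, 90, 95, 92, 99]
centers = kmeans_1d(spend, k=2)
```

The two returned centers separate the low-spend and high-spend segments, which is the basic mechanism behind propensity- and lifecycle-based targeting.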
We are hiring for Data Engineer.
- Exp: 2-4 Years
- CTC: Up to 10 LPA
- Location: Remote, Pune, Gurugram, New Delhi
- Experience designing, building and maintaining data architecture and warehousing using AWS services
- Expertise in ETL optimization, and in designing, coding, and tuning big data processes using Apache Spark, R, Python, C# and/or similar technologies
- Experience managing AWS resources using Terraform
- Experience in Data engineering and infrastructure work for analytical and machine learning processes
- Experience with ETL tooling, migrating ETL code from one technology to another will be a benefit
- Experience using data visualisation/dashboarding tools to QA/QC data processes
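The ETL design-and-tuning work described above follows an extract-transform-load shape regardless of engine. As a minimal stand-in (pure Python and `sqlite3` rather than Spark or AWS services, which aren't assumed here), with an invented CSV feed:

```python
import csv
import io
import sqlite3

# Hypothetical raw feed; a real pipeline would read from S3 or a database.
RAW = """order_id,amount,country
1,100.5,IN
2,,IN
3,250.0,US
"""

def extract(text):
    """Parse the raw CSV feed into dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Drop rows with missing amounts and cast fields to typed tuples."""
    return [(int(r["order_id"]), float(r["amount"]), r["country"])
            for r in rows if r["amount"]]

def load(rows, conn):
    """Write the cleaned rows into a warehouse table."""
    conn.execute("CREATE TABLE orders (order_id INT, amount REAL, country TEXT)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
```

Swapping each stage for a Spark read, DataFrame transform, and Redshift write preserves the same structure, which is what makes ETL migrations between technologies tractable.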
If interested, kindly share your updated CV at [email protected] (tigihr.com)
- 2+ years of analytics experience, predominantly in SQL, SAS, statistics, R, Python, and visualization
- Experienced in writing complex SQL SELECT queries (window functions and CTEs), with advanced SQL experience
- Should work as an individual contributor for the initial few months; a team will be aligned based on project movement
- Strong in querying logic and data interpretation
- Solid communication and articulation skills
- Able to handle stakeholders independently, with minimal intervention from the reporting manager
- Develop strategies to solve problems in logical yet creative ways
- Create custom reports and presentations accompanied by strong data visualization and storytelling
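The combination of a CTE and a window function mentioned above can be demonstrated end to end with the stdlib `sqlite3` module (SQLite supports window functions from version 3.25, which ships with modern Python builds). The table and data here are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, amount INT);
INSERT INTO sales VALUES ('N', 100), ('N', 300), ('S', 200), ('S', 50);
""")

# A CTE aggregates revenue per region; a window function then ranks
# the regions without a second round of grouping.
query = """
WITH region_totals AS (
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
)
SELECT region, total,
       RANK() OVER (ORDER BY total DESC) AS revenue_rank
FROM region_totals
ORDER BY revenue_rank
"""
rows = conn.execute(query).fetchall()
```

The same pattern (stage the aggregate in a CTE, rank or window over it) carries over unchanged to Oracle, SQL Server, and Redshift.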
- General or strong IT background, with at least 2 to 4 years of working experience
- Strong understanding of data integration and ETL methodologies
- Demonstrated ability to multi-task
- Excellent English communication skills
- A desire to be part of a growing company. You'll have 2 core responsibilities (client work and company building), and we expect dedication to both.
- Willingness to learn and work on new technologies
- Should be a quick self-learner
1. Good knowledge of Power BI and Tableau
2. Good experience in handling data in Excel.
Punchh is the leader in customer loyalty, offer management, and AI solutions for offline and omni-channel merchants including restaurants, convenience stores, and retailers. Punchh brings the power of online to physical brands by delivering omni-channel experiences and personalization across the entire customer journey--from acquisition through loyalty and growth--to drive same store sales and customer lifetime value. Punchh uses best-in-class integrations to POS and other in-store systems such as WiFi, to deliver real-time SKU-level transaction visibility and offer provisioning for physical stores.
Punchh is growing exponentially and serves 200+ brands that encompass 91K+ stores globally. Punchh’s customers include the top convenience stores such as Casey’s General Stores, 25+ of the top 100 restaurant brands such as Papa John's, Little Caesars, Denny’s, Focus Brands (5 of 7 brands), and Yum! Brands (KFC, Pizza Hut, and Taco Bell), and retailers. For a multi-billion-dollar brand with 6K+ stores, Punchh drove a 3% lift in same-store sales within the first year. Punchh is powering loyalty programs for 135+ million consumers.
Punchh has raised $70 million from premier Silicon Valley investors including Sapphire Ventures and Adam Street Partners, and has a seasoned leadership team with extensive experience in digital, marketing, CRM, and AI technologies, as well as deep restaurant and retail industry expertise.
About the Role:
Punchh Tech India Pvt. is looking for a Senior Data Analyst – Business Insights to join our team. If you're excited to be part of a winning team, Punchh is a great place to grow your career.
This position is responsible for discovering important trends in the complex data generated on the Punchh platform that have high business impact (influencing product features and the roadmap), creating hypotheses around these trends, validating them with statistical significance, and making recommendations.
Reporting to: Director, Analytics
Job Location: Jaipur
Experience Required: 4-6 years
What You’ll Do
- Take ownership of custom data analysis projects/requests and work closely with end users (both internal and external clients) to deliver the results
- Identify successful implementation/utilization of product features and contribute to the best-practices playbook for client facing teams (Customer Success)
- Strive towards building mini business intelligence products that add value to the client base
- Represent the company’s expertise in advanced analytics in a variety of media outlets such as client interactions, conferences, blogs, and interviews.
What You’ll Need
- Master’s in business/behavioral economics/statistics with a strong interest in marketing technology
- Proven track record of at least 5 years uncovering business insights, especially related to behavioral economics, and adding value to businesses
- Proficient in using the proper statistical and econometric approaches to establish the presence and strength of trends in data. Strong statistical knowledge is mandatory.
- Extensive prior exposure to causal inference studies based on both longitudinal and cross-sectional data.
- Excellent experience using Python (or R) to analyze data from extremely large or complex data sets
- Exceptional data querying skills (Snowflake/Redshift, Spark, Presto/Athena, to name a few)
- Ability to effectively articulate complex ideas in simple and effective presentations to diverse groups of stakeholders.
- Experience working with a visualization tool (preferably, but not restricted to Tableau)
- Domain expertise: extensive exposure to retail business, restaurant business or worked on loyalty programs and promotion/campaign effectiveness
- Should be self-organized and be able to proactively identify problems and propose solutions
- Gels well within and across teams; works with stakeholders from various functions such as Product, Customer Success, and Implementations, among others
- As the business-side stakeholders are based in the US, should be flexible to schedule meetings convenient to West Coast hours
- Effective in working autonomously to get things done and taking the initiatives to anticipate needs of executive leadership
- Able and willing to relocate to Jaipur post-pandemic.
- Medical Coverage, to keep you and your family healthy.
- Compensation that stacks up with other tech companies in your area.
- Paid vacation days and holidays to rest and relax.
- Healthy lunch provided daily to fuel you through your work.
- Opportunities for career growth and training support, including fun team building events.
- Flexibility and a comfortable work environment for you to feel your best.
We are looking for a Senior Database Developer to provide a senior-level contribution to designing, developing and implementing critical business enterprise applications for marketing systems.
- Play a lead role in developing, deploying and managing our databases (Oracle, MySQL and MongoDB) on public clouds.
- Design and develop PL/SQL processes to perform complex ETL processes.
- Develop UNIX shell and Perl scripts for data auditing and automation.
- Responsible for database builds and change requests.
- Holistically define the overall reference architecture and manage its overall implementation in the production systems.
- Identify architecture gaps that can improve availability, performance and security for both production systems and database systems, and work towards resolving those issues.
- Work closely with Engineering, Architecture, Business and Operations teams to provide necessary and continuous feedback.
- Automate all the manual steps for the database platform.
- Deliver solutions for access management, availability, security, replication and patching.
- Troubleshoot application database performance issues.
- Participate in daily huddles (30 min.) to collaborate with onshore and offshore teams.
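The data-auditing scripts mentioned above usually compare counts and checksums between a source and a target after a load. A minimal sketch in Python (standing in for the Perl the posting names) against invented in-memory tables:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src (id INT, amount REAL);
CREATE TABLE tgt (id INT, amount REAL);
INSERT INTO src VALUES (1, 10.0), (2, 20.0), (3, 30.0);
INSERT INTO tgt VALUES (1, 10.0), (2, 20.0);  -- one row went missing
""")

def audit(conn, source, target):
    """Compare row counts and amount totals between two tables.

    Table names are trusted identifiers here; a production script
    would validate them before interpolating into SQL.
    """
    checks = {}
    for name in (source, target):
        count, total = conn.execute(
            f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {name}"
        ).fetchone()
        checks[name] = (count, total)
    return checks[source] == checks[target], checks

ok, checks = audit(conn, "src", "tgt")
```

A failed check like this one is what triggers the troubleshooting and change-request work the role describes.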
- 5+ years of experience in database development.
- Bachelor’s degree in Computer Science, Computer Engineering, Math, or similar.
- Experience using ETL tools (Talend or Ab Initio a plus).
- Experience with relational database programming, processing and tuning (Oracle, PL/SQL, MySQL, MS SQL Server, SQL, T-SQL).
- Familiarity with BI tools (Cognos, Tableau, etc.).
- Experience with Cloud technology (AWS, etc.).
- Agile or Waterfall methodology experience preferred.
- Experience with API integration.
- Advanced software development and scripting skills for use in automation and interfacing with databases.
- Knowledge of software development lifecycles and methodologies.
- Knowledge of developing procedures, packages and functions in a DW environment.
- Knowledge of UNIX, Linux and Service Oriented Architecture (SOA).
- Ability to multi-task, to work under pressure, and think analytically.
- Ability to work with minimal supervision and meet deadlines.
- Ability to write technical specifications and documents.
- Ability to communicate effectively with individuals at all levels in the company and with various business contacts outside of the company in an articulate, professional manner.
- Knowledge of CDP, CRM, MDM and Business Intelligence is a plus.
- Flexible work hours.
This position description is intended to describe the duties most frequently performed by an individual in this position. It is not intended to be a complete list of assigned duties but to describe a position level.
Location: Chennai- Guindy Industrial Estate
Duration: Full time role
Company: Mobile Programming (https://www.mobileprogramming.com/)
Client Name: Samsung
We are looking for a Data Engineer to join our growing team of analytics experts. The hire will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts and data scientists on data initiatives and will ensure that optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems and products.
Responsibilities for Data Engineer
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional/non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS big data technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
Qualifications for Data Engineer
- Experience building and optimizing big data ETL pipelines, architectures and data sets.
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Working knowledge of message queuing, stream processing and highly scalable ‘big data’ data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
We are looking for a candidate with 3-6 years of experience in a Data Engineer role who has attained a graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. They should also have experience using the following software/tools:
- Experience with big data tools: Spark, Kafka, HBase, Hive, etc.
- Experience with relational SQL and NoSQL databases
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience with stream-processing systems: Storm, Spark Streaming, etc.
- Experience with object-oriented/functional scripting languages: Python, Java, Scala, etc.
Skills: Big Data, AWS, Hive, Spark, Python, SQL
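The stream-processing experience asked for above (Storm, Spark Streaming) boils down to maintaining incremental state over an unbounded event sequence. A dependency-free sketch of that idea, computing a rolling mean over a stream with a bounded buffer (invented event values, not a Samsung system):

```python
from collections import deque

def rolling_mean(stream, window):
    """Consume an event stream and yield a rolling mean after each event.

    The deque's maxlen bounds memory, mirroring how streaming engines
    keep only a fixed window of state rather than the full history.
    """
    buf = deque(maxlen=window)
    for value in stream:
        buf.append(value)
        yield sum(buf) / len(buf)

# Hypothetical metric events arriving in order.
events = [10, 20, 30, 40, 50]
means = list(rolling_mean(events, window=3))
```

In Spark Streaming or Kafka Streams the same windowed aggregation is expressed declaratively, but the per-event update of bounded state is the core mechanic.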