25+ Data Visualization Jobs in Bangalore (Bengaluru) | Data Visualization Job openings in Bangalore (Bengaluru)
Apply to 25+ Data Visualization Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest Data Visualization Job opportunities across top companies like Google, Amazon & Adobe.
Role: Data Analyst (Apprentice - 1 Year contract)
Location: Bangalore, Hybrid Model
Job Summary: Join our team as an Apprentice for one year and take the next step in your career while supporting FOX’s unique corporate services and Business Intelligence (BI) platforms. This role offers a fantastic opportunity to leverage your communication skills, technical expertise, analytical and problem-solving competencies, and customer-focused experience.
Key Responsibilities:
· Assist in analyzing and solving data-related challenges.
· Support the development and maintenance of corporate service and BI platforms.
· Collaborate with cross-functional teams to enhance user experience and operational efficiency.
· Participate in training sessions to further develop technical and analytical skills.
· Conduct research and analysis to identify trends and insights in data.
· Prepare reports and presentations to communicate findings to stakeholders.
· Engage with employees to understand their needs and provide support.
· Contribute insights and suggestions during team meetings to drive continuous improvement.
Qualifications:
· Bachelor’s degree in Engineering (2024 pass-out).
· Strong analytical and technical skills with attention to detail.
· Excellent communication skills, both verbal and written.
· Ability to work collaboratively in a team-oriented environment.
· Proactive attitude and a strong willingness to learn.
· Familiarity with data analysis tools and software (e.g., Excel, SQL, Tableau) is a plus.
· Basic understanding of programming languages (e.g., Python, R) is an advantage.
Additional Information:
- This position offers a hybrid work model, allowing flexibility between remote and in-office work.
- Opportunities for professional development and skill enhancement through buddy and mentorship programs.
- Exposure to real-world projects and the chance to contribute to impactful solutions.
- A supportive and inclusive team environment that values diverse perspectives.
Are you passionate about pushing the boundaries of Artificial Intelligence and its applications in the software development lifecycle? Are you excited about building AI models that can revolutionize how developers ship, refactor, and onboard to legacy or existing applications faster? If so, Zevo.ai has the perfect opportunity for you!
As an AI Researcher/Engineer at Zevo.ai, you will play a crucial role in developing cutting-edge AI models using CodeBERT and CodeXGLUE to achieve our goal of providing an AI solution that supports developers throughout the sprint cycle. You will be at the forefront of research and development, harnessing the power of Natural Language Processing (NLP) and Machine Learning (ML) to revolutionize the way software development is approached.
Responsibilities:
- AI Model Development: Design, implement, and refine AI models utilizing CodeBERT and CodeXGLUE to comprehend codebases, facilitate code understanding, automate code refactoring, and enhance the developer onboarding process.
- Research and Innovation: Stay up-to-date with the latest advancements in NLP and ML research, identifying novel techniques and methodologies that can be applied to Zevo.ai's AI solution. Conduct experiments, perform data analysis, and propose innovative approaches to enhance model performance.
- Data Collection and Preparation: Collaborate with data engineers to identify, collect, and preprocess relevant datasets necessary for training and evaluating AI models. Ensure data quality, correctness, and proper documentation.
- Model Evaluation and Optimization: Develop robust evaluation metrics to measure the performance of AI models accurately. Continuously optimize and fine-tune models to achieve state-of-the-art results.
- Code Integration and Deployment: Work closely with software developers to integrate AI models seamlessly into Zevo.ai's platform. Ensure smooth deployment and monitor the performance of the deployed models.
- Collaboration and Teamwork: Collaborate effectively with cross-functional teams, including data scientists, software engineers, and product managers, to align AI research efforts with overall company objectives.
- Documentation: Maintain detailed and clear documentation of research findings, methodologies, and model implementations to facilitate knowledge sharing and future developments.
- Ethics and Compliance: Ensure compliance with ethical guidelines and legal requirements related to AI model development, data privacy, and security.
Requirements
- Educational Background: Bachelor's/Master's or Ph.D. in Computer Science, Artificial Intelligence, Machine Learning, or a related field. A strong academic record with a focus on NLP and ML is highly desirable.
- Technical Expertise: Proficiency in NLP, Deep Learning, and experience with AI model development using frameworks like PyTorch or TensorFlow. Familiarity with CodeBERT and CodeXGLUE is a significant advantage.
- Programming Skills: Strong programming skills in Python and experience working with large-scale software projects.
- Research Experience: Proven track record of conducting research in NLP, ML, or related fields, demonstrated through publications, conference papers, or open-source contributions.
- Problem-Solving Abilities: Ability to identify and tackle complex problems related to AI model development and software engineering.
- Team Player: Excellent communication and interpersonal skills, with the ability to collaborate effectively in a team-oriented environment.
- Passion for AI: Demonstrated enthusiasm for AI and its potential to transform software development practices.
If you are eager to be at the forefront of AI research, driving innovation and impacting the software development industry, join Zevo.ai's talented team of experts as an AI Researcher/Engineer. Together, we'll shape the future of the sprint cycle and revolutionize how developers approach code understanding, refactoring, and onboarding!
Skills required
· Certified in Google Analytics (any popular certifier)
· A minimum of 4+ years of experience in the web analytics domain
· Consolidated knowledge of the GA4 platform
· Excellent communication skills (written & verbal)
· Good understanding of Analytics, Tag Manager, and Data Studio.
· Basic knowledge of BigQuery & JavaScript
· Familiar with HTML, CSS, and JavaScript, with the ability to read, reuse, and customise code
· Advanced understanding of Events Tagging and Custom dimensions.
· Hands-on experience with data visualization (custom dashboards)
Roles and Responsibilities
· Responsible for managing our Analytics (GA4) and GTM accounts
· Implementation of highly effective web data analytic solutions
· Visualize customer behavioral data such as page/path analysis, clickstream, funnel progression, CTA optimization, page/site abandonment, etc.
· Implement tagging and configuration that ensures data collection is accurate and adequate based on the business needs
In-depth knowledge in the following areas:
· Determine tracking requirements.
· Create functional and technical design (DOM scraping & Custom HTML) of the tags.
· Install GTM container tag on the live/staging websites.
· Create, publish and test tags on the live/staging websites.
· Track successful form submissions.
· Do regular Tag audits for the live/staging websites.
· Coordinate with the marketing heads to understand and develop custom reports and dashboards.
· Create specified digital analytics tagging standards to ensure robust and consistent data capturing.
· Testing and validating the tracking using various debugging tools.
Responsibilities
This role requires a person to support business charters & accompanying products by aligning with the Analytics Manager's vision, understanding tactical requirements, and helping in successful execution. The split would be approximately 70% management + 30% individual contributor. Responsibilities include
Project Management
- Understand business needs and objectives.
- Refine use cases and plan iterations and deliverables - able to pivot as required.
- Estimate efforts and conduct regular task updates to ensure timeline adherence.
- Set and manage stakeholder expectations as required
Quality Execution
- Help BA and SBA resources with requirement gathering and final presentations.
- Resolve blockers regarding technical challenges and decision-making.
- Check final deliverables for correctness and review codes, along with Manager.
KPIs and metrics
- Orchestrate metrics building, maintenance, and performance monitoring.
- Own and manage data models, data sources, and the data definition repo.
- Make low-level design choices during execution.
Team Nurturing
- Help Analytics Manager during regular one-on-ones + check-ins + recruitment.
- Provide technical guidance whenever required.
- Improve benchmarking and decision-making skills at execution-level.
- Train and get new resources up-to-speed.
- Knowledge building (methodologies) to better position the team for complex problems.
Communication
- Upstream to document and discuss execution challenges, process inefficiencies, and feedback loops.
- Downstream and parallel for context-building, mentoring, stakeholder management.
Analytics Stack
- Analytics : Python / R + SQL + Excel / PPT, Colab notebooks
- Database : PostgreSQL, Amazon Redshift, DynamoDB, Aerospike
- Warehouse : Amazon Redshift
- ETL : mostly Python + custom-built tooling
- Business Intelligence / Visualization : Metabase + Python/R libraries (location data)
- Deployment pipeline : Docker, Git, Jenkins, AWS Lambda
Responsibilities
- Coordinate with regional teams and collect the required data
- Knowledge of advanced MS Excel and presentations is a must
- Candidate must be able to handle large data sets and spreadsheets
- Good at analyzing data and presenting findings
- Coordination with the team for timely submission of reports, tracking progress
- Understand the reporting model infrastructure and implement streamlined process solutions toward a more efficient reporting model.
- Preparation and analysis of daily, weekly, and monthly reports of different functions and summarizing the same
- Must be able to Generate/Update Reports, Create dashboards, Pivot tables
- Monthly/ Quarterly reconciliations and data cleansing to ensure the integrity of reported numbers.
- Review and investigate trends and provide commentary for management.
- Finding discrepancies in the process and arriving at a resolution
Eligibility
- Proven experience in MIS
- Ability to be self-motivated and self-directed and think and act independently while also being team oriented
- Good follow-up skills, the ability to understand, adaptability to process changes, and a strong sense of importance and ownership
- Good with numbers (mathematics)
- Proficient in Excel
- Good communication skills
Personal skills:
Strong interpersonal and team working skills
Self-motivated and able to work with minimal supervision
Demonstrate strong analytical and problem-solving skills
Ability to communicate effectively at all levels
Strong attention to detail
Duties and Responsibilities:
Research and Develop Innovative Use Cases, Solutions and Quantitative Models
Quantitative Models in Video and Image Recognition and Signal Processing for cloudbloom's cross-industry business (e.g., Retail, Energy, Industry, Mobility, Smart Life and Entertainment).
Design, Implement and Demonstrate Proof-of-Concept and Working Prototypes
Provide R&D support to productize research prototypes.
Explore emerging tools, techniques, and technologies, and work with academia for cutting-edge solutions.
Collaborate with cross-functional teams and eco-system partners for mutual business benefit.
Team Management Skills
Academic Qualification
7+ years of professional hands-on work experience in data science, statistical modelling, data engineering, and predictive analytics assignments
Mandatory Requirements: Bachelor's degree with a STEM background (Science, Technology, Engineering and Mathematics) with a strong quantitative flavour
Innovative and creative in data analysis, problem solving and presentation of solutions.
Ability to establish effective cross-functional partnerships and relationships at all levels in a highly collaborative environment
Strong experience in handling multi-national client engagements
Good verbal, writing & presentation skills
Core Expertise
Excellent understanding of basics in mathematics and statistics (such as differential equations, linear algebra, matrices, combinatorics, probability, Bayesian statistics, eigenvectors, Markov models, Fourier analysis).
Building data analytics models using Python, ML libraries, Jupyter/Anaconda, and knowledge of database query languages like SQL
Good knowledge of machine learning methods like k-Nearest Neighbors, Naive Bayes, SVM, and Decision Forests.
Strong math skills (Multivariable Calculus and Linear Algebra): understanding the fundamentals of multivariable calculus and linear algebra is important, as they form the basis of many predictive performance and algorithm optimization techniques.
Deep learning: CNNs, neural networks, RNNs, TensorFlow, PyTorch, computer vision
Large-scale data extraction/mining, data cleansing, diagnostics, and preparation for modeling
Good applied statistical skills, including knowledge of statistical tests, distributions, regression, and maximum likelihood estimators; multivariate techniques & predictive modeling: cluster analysis, discriminant analysis, CHAID, logistic & multiple regression analysis
Experience with data visualization tools like Tableau, Power BI, and Qlik Sense that help to visually encode data
Excellent communication skills: it is incredibly important to describe findings to both technical and non-technical audiences
Capability for continuous learning and knowledge acquisition.
Mentor colleagues for growth and success
Strong Software Engineering Background
Hands-on experience with data science tools
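As a minimal illustration of one of the machine learning methods named above (k-Nearest Neighbors), here is a sketch in plain Python; the 2-D points and labels are invented for the example:

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training points."""
    nearest = sorted(train, key=lambda item: math.dist(item[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Two toy 2-D clusters with hypothetical labels "a" and "b"
train = [((1.0, 1.0), "a"), ((1.2, 0.8), "a"), ((0.9, 1.1), "a"),
         ((5.0, 5.0), "b"), ((5.2, 4.9), "b"), ((4.8, 5.1), "b")]
print(knn_predict(train, (1.1, 1.0)))  # "a"
```

In practice a library such as scikit-learn would be used; the sketch shows only the distance-and-vote mechanics.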
GCP Data Analyst profile must have the below skill sets:
- Knowledge of programming languages like SQL, Oracle, R, MATLAB, Java, and Python
- Data cleansing, data visualization, data wrangling
- Data modeling , data warehouse concepts
- Adept with big data platforms like Hadoop and Spark for stream & batch processing
- GCP (Cloud Dataproc, Cloud Dataflow, Cloud Datalab, Cloud Dataprep, BigQuery, Cloud Datastore, Cloud Datafusion, Auto ML etc)
4-6 years of total experience in data warehousing and business intelligence
3+ years of solid Power BI experience (Power Query, M-Query, DAX, Aggregates)
2 years’ experience building Power BI using cloud data (Snowflake, Azure Synapse, SQL DB, data lake)
Strong experience building visually appealing UI/UX in Power BI
Understand how to design Power BI solutions for performance (composite models, incremental refresh, analysis services)
Experience building Power BI using large data in direct query mode
Expert SQL background (query building, stored procedure, optimizing performance)
Job Role & Responsibility:
The VMware NSBU HCX team is looking for passionate engineers with a startup mindset, ready to solve business problems by applying technical engineering knowledge. You will be part of the enterprise-class HCX cloud mobility platform, helping customers solve datacenter evacuation, consolidation, and hybrid cloud use cases. You'll be part of a dynamic engineering team where a passion for innovation is key to addressing technical problems, improving performance and scalability, and where a focus on continuous improvement is of paramount importance. You will be part of an R&D development team focused on various aspects of HCX, ranging from workload mobility to networking in the context of the hybrid cloud paradigm. As part of the job, you will work with senior architects and other engineers on the team to deliver a world-class enterprise product.
Required Skills:
- Background with Computer Science fundamentals (based on a BS or MS in CS or related field) with 4+ years of substantial professional experience
- Strong programming skills in Java/Go/Python
- Knowledge of distributed systems
- Understanding of Micro Services architecture, REST APIs Design and Development
- Understanding of Kubernetes, Kafka, NoSQL, Java/Spring, Client MVC
- Exposure to one or more UI technologies
- Organized and passionate about details; able to effectively perform multiple/concurrent tasks within deadlines in a dynamic environment
Preferred Skills:
- Strong knowledge about virtualization, and/or container technologies
- Experience in the SDN/Networking/Network Management domain is an added plus
- Exposure to AWS, Azure is an added plus
A global provider of Business Process Management services
Power BI Developer
Senior visualization engineer with 5 years' experience in Power BI to develop and deliver solutions that enable delivery of information to audiences in support of key business processes. In addition, hands-on experience with Azure data services like ADF and Databricks is a must.
Ensure code and design quality through execution of test plans and assist in development of standards & guidelines working closely with internal and external design, business, and technical counterparts.
Candidates should have worked in agile development environments.
Desired Competencies:
- Should have minimum of 3 years project experience using Power BI on Azure stack.
- Should have good understanding and working knowledge of Data Warehouse and Data Modelling.
- Good hands-on experience of Power BI
- Hands-on experience with T-SQL/DAX/MDX/SSIS
- Data Warehousing on SQL Server (preferably 2016)
- Experience in Azure Data Services – ADF, DataBricks & PySpark
- Manage own workload with minimum supervision.
- Take responsibility of projects or issues assigned to them
- Be personable, flexible and a team player
- Good written and verbal communications
- Have a strong personality and be able to operate directly with users
We are looking for a Senior Data Analyst who will support our product, sales, leadership, and marketing teams with data collection and organization of data in order to draw conclusions, make predictions, and drive informed decision making.
- The ideal candidate is adept at using large data sets to find opportunities for product and marketing optimization and using data models to test the effectiveness of different courses of action.
- Not just that, you also understand how to clean and organize data for complete analysis and calculations using spreadsheets, SQL, Python, and R.
- Visualizing and presenting data findings in dashboards, presentations and commonly used visualization platforms must be your strength.
SKILLS YOU MUST HAVE:
- Data Cleansing, Data Analysis, Data Visualization (DataViz), SQL, Questioning, Decision-Making, Problem Solving, Data Collection, Data Ethics, Sample Size Determination, Spreadsheets
RESPONSIBILITIES:
- Work with stakeholders throughout the organization to identify opportunities for leveraging company data to drive business solutions.
- Mine and analyze data from company databases (GOOGLE ANALYTICS, User DB, and 3rd party Marketing Analytics tools) to drive optimization and improvement of product development, marketing techniques, and business strategies.
- Assess the effectiveness and accuracy of new data sources and data gathering techniques.
- Strong understanding of Google Analytics (GA) fundamentals and how to work with raw GA data
- Automate custom data fetching from Google Analytics dashboard to excel sheet and other primary and secondary sources
- Work with complex SQL queries
- Work with BigQuery with GA integration to extract data
- Preparing final analysis reports for the stakeholders to understand the data-analysis steps, enabling them to take important decisions based on various facts and trends.
- Filter Data by reviewing reports and performance indicators to identify and correct code problems
- Developing and maintaining databases and data systems; reorganizing data into a readable format
- Using statistical tools to identify, analyze, and interpret patterns and trends in complex data sets that could help with diagnosis and prediction
- Assign a numerical value to essential business functions so that business performance can be assessed and compared over periods of time.
- Use predictive modeling to increase and optimize customer experiences, revenue generation, ad targeting, and other business outcomes.
- Develop company A/B testing framework and test model quality.
- Coordinate with different functional teams to implement models and monitor outcomes.
- Develop processes and tools to monitor and analyze model performance and data accuracy.
- Understanding of statistical concepts like Statistical significance, t-test, and z-test for AB testing
- Performing analysis to assess quality and meaning of data
- Preparing reports for the management stating trends, patterns, and predictions using relevant data
- Working with programmers, engineers, and management heads to identify process improvement opportunities, propose system modifications, and devise data governance strategies.
- Robust understanding of Mixpanel and Amplitude
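The A/B-testing statistics mentioned in the responsibilities above (statistical significance via a t-test) can be sketched with the standard library alone; the two samples below are hypothetical per-user conversion times for variants A and B:

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's two-sample t statistic (unequal variances assumed)."""
    se = math.sqrt(variance(a) / len(a) + variance(b) / len(b))
    return (mean(a) - mean(b)) / se

# Hypothetical conversion times (seconds) for two variants
variant_a = [12.1, 11.8, 12.5, 12.0, 11.9, 12.3]
variant_b = [11.2, 11.5, 11.0, 11.4, 11.3, 11.6]
print(round(welch_t(variant_a, variant_b), 2))  # 5.55: a large |t| suggests a real difference
```

In practice `scipy.stats.ttest_ind(..., equal_var=False)` would also give the p-value; the sketch shows only the statistic itself.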
Skill requirement:
- 6-8 years of experience in database management and a strong understanding of Google Analytics
- Strong mathematical skills to help collect, measure, organize, and analyze data
- Experience using statistical computer languages (R, Python, MATLAB, SQL, etc.) to manipulate data and draw insights from large data sets.
- Experience analyzing data from 3rd party providers: Google Analytics, Site Catalyst, Coremetrics, Adwords, Crimson Hexagon, Facebook Insights, etc.
- Technical proficiency regarding database design development, data models, techniques for data mining, and segmentation.
- Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.) and their real-world advantages/drawbacks.
- Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests, and proper usage, etc.) and experience with applications.
- Knowledge of data visualization software like Tableau
- Knowledge of how to create and apply the most accurate algorithms to datasets in order to find solutions
- Accuracy and attention to detail
- Adept at queries, advanced Excel, writing reports, and making presentations
- Team-working skills
- Verbal and Written communication skills
- Proven working experience in data analysis
Responsibilities
- Understanding the business requirements so as to formulate the problems to solve and restrict the slice of data to be explored.
- Collecting data from various sources.
- Performing cleansing, processing, and validation on the data subject to analyze, in order to ensure its quality.
- Exploring and visualizing data.
- Performing statistical analysis and experiments to derive business insights.
- Clearly communicating the findings from the analysis to turn information into something actionable through reports, dashboards, and/or presentations.
- Preparing business dashboards for teams to add transparency in the process and uncover bottlenecks
- Conceive and prepare product dashboards to highlight transparently the user journey on the BitClass platform and outline bottlenecks/wins in the same.
Skills
- Experience solving problems in the project’s business domain.
- Experience with data integration from multiple sources
- Proficiency in at least one query language, especially SQL.
- Working experience with NoSQL databases, such as MongoDB and Elasticsearch.
- Working experience with popular statistical and machine learning techniques, such as clustering, linear regression, KNN, decision trees, etc.
- Good scripting skills using Python, R or any other relevant language
- Proficiency in at least one data visualization tool, such as Matplotlib, Plotly, D3.js, ggplot, etc.
- Great communication skills.
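The SQL proficiency and funnel-analysis work described above can be illustrated with Python's built-in sqlite3; the `events` table and its rows are invented for the example:

```python
import sqlite3

# Toy funnel aggregation: count distinct users reaching each step (hypothetical schema)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, step TEXT)")
conn.executemany("INSERT INTO events VALUES (?, ?)", [
    (1, "visit"), (1, "signup"), (1, "purchase"),
    (2, "visit"), (2, "signup"),
    (3, "visit"),
])
rows = conn.execute(
    "SELECT step, COUNT(DISTINCT user_id) AS users "
    "FROM events GROUP BY step ORDER BY users DESC"
).fetchall()
print(rows)  # [('visit', 3), ('signup', 2), ('purchase', 1)]
```

The same query shape scales to real warehouses; only the connection object would change.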
at Velocity Services
We are an early stage start-up, building new fintech products for small businesses. Founders are IIT-IIM alumni, with prior experience across management consulting, venture capital and fintech startups. We are driven by the vision to empower small business owners with technology and dramatically improve their access to financial services. To start with, we are building a simple, yet powerful solution to address a deep pain point for these owners: cash flow management. Over time, we will also add digital banking and 1-click financing to our suite of offerings.
We have developed an MVP which is being tested in the market. We have closed our seed funding from marquee global investors and are now actively building a world class tech team. We are a young, passionate team with a strong grip on this space and are looking to on-board enthusiastic, entrepreneurial individuals to partner with us in this exciting journey. We offer a high degree of autonomy, a collaborative fast-paced work environment and most importantly, a chance to create unparalleled impact using technology.
Reach out if you want to get in on the ground floor of something which can turbocharge SME banking in India!
Technology stack at Velocity comprises a wide variety of cutting-edge technologies like NodeJS, Ruby on Rails, reactive programming, Kubernetes, AWS, Python, ReactJS, Redux (Saga), Redis, Lambda, etc.
Key Responsibilities
- Responsible for building data and analytical engineering pipelines with standard ELT patterns, implementing data compaction pipelines, data modelling, and overseeing overall data quality
- Work with the Office of the CTO as an active member of our architecture guild
- Write pipelines to consume data from multiple sources
- Write a data transformation layer using DBT to transform millions of rows into the data warehouse
- Implement data warehouse entities with common re-usable data model designs with automation and data quality capabilities
- Identify downstream implications of data loads/migration (e.g., data quality, regulatory)
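A minimal sketch of the ELT pattern described above, with an in-memory SQLite database standing in for the warehouse and a plain SQL `CREATE TABLE ... AS SELECT` standing in for a DBT model (the payments schema and values are hypothetical):

```python
import sqlite3

# Extract-Load: raw rows land in the warehouse untransformed
wh = sqlite3.connect(":memory:")
wh.execute("CREATE TABLE raw_payments (id INTEGER, amount_cents INTEGER, status TEXT)")
wh.executemany("INSERT INTO raw_payments VALUES (?, ?, ?)", [
    (1, 1250, "settled"), (2, 300, "failed"), (3, 2000, "settled"),
])

# Transform: a derived model, as a DBT model would define it in SQL
wh.execute("""
    CREATE TABLE fct_settled_payments AS
    SELECT id, amount_cents / 100.0 AS amount
    FROM raw_payments WHERE status = 'settled'
""")
total = wh.execute("SELECT SUM(amount) FROM fct_settled_payments").fetchone()[0]
print(total)  # 32.5
```

The point of ELT over ETL is exactly this: the transform is expressed in warehouse SQL, so tools like DBT can version, test, and rebuild it declaratively.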
What To Bring
- 3+ years of software development experience; startup experience is a plus
- Past experience working with Airflow and DBT is preferred
- 2+ years of experience working in any backend programming language
- Strong first-hand experience with data pipelines and relational databases such as Oracle, Postgres, SQL Server, or MySQL
- Experience with DevOps tools (GitHub, Travis CI, and JIRA) and methodologies (Lean, Agile, Scrum, Test-Driven Development)
- Experienced with the formulation of ideas: building proofs-of-concept (POCs) and converting them to production-ready projects
- Experience building and deploying applications on on-premise and cloud-based infrastructure (AWS or Google Cloud)
- Basic understanding of Kubernetes & Docker is a must
- Experience in data processing (ETL, ELT) and/or cloud-based platforms
- Working proficiency and communication skills in verbal and written English
a global business process management company
Power BI Developer (Azure Developer)
Job Description:
Senior visualization engineer with an understanding of Azure Data Factory & Databricks to develop and deliver solutions that enable delivery of information to audiences in support of key business processes.
Ensure code and design quality through execution of test plans and assist in development of standards & guidelines working closely with internal and external design, business and technical counterparts.
Desired Competencies:
- Strong data visualization design concepts centered on the business user, and a knack for communicating insights visually.
- Ability to produce any of the charting methods available with drill down options and action-based reporting. This includes use of right graphs for the underlying data with company themes and objects.
- Publishing reports & dashboards on reporting server and providing role-based access to users.
- Ability to create wireframes on any tool for communicating the reporting design.
- Creation of ad-hoc reports & dashboards to visually communicate data hub metrics (metadata information) for top management understanding.
- Should be able to handle huge volumes of data from databases such as SQL Server, Synapse, Delta Lake, or flat files and create high-performance dashboards.
- Should be good in Power BI development
- Expertise in 2 or more BI (Visualization) tools in building reports and dashboards.
- Understanding of Azure components like Azure Data Factory, Data lake Store, SQL Database, Azure Databricks
- Strong knowledge in SQL queries
- Must have worked in full life-cycle development from functional design to deployment
- Intermediate understanding to format, process and transform data
- Should have working knowledge of GIT, SVN
- Good experience in establishing connections with heterogeneous sources like Hadoop, Hive, Amazon, Azure, Salesforce, SAP, HANA, APIs, various databases, etc.
- Basic understanding of data modelling and ability to combine data from multiple sources to create integrated reports
Preferred Qualifications:
- Bachelor's degree in Computer Science or Technology
- Proven success in contributing to a team-oriented environment
Generate product insights that will drive product strategy and roadmap
Develop dashboards and define metrics that inform success for the Product Team
Help design, execute, and evaluate A/B tests to improve the user journey
Explore large, complex, and loosely defined datasets to create actionable insights
Facilitate changes to product features to improve competitive position and optimal product performance
Work collaboratively with various teams including marketing, business, UX, customer support, and engineering
1-3 years of analyst experience at a product company
Undergraduate degree from Tier 1 colleges (IIT / BITS / NIT)
Comfortable diving into data and deriving tangible insights
Experience with product analytics tools like Amplitude/Clevertap, and data visualization tools like PowerBI/Tableau, is a plus
Experience with running A/B tests and experiments
Prior startup and fintech experience is a plus
About the role
- Collaborating with a team of like-minded and experienced engineers for Tier 1 customers, you will focus on data engineering on large complex data projects. Your work will have an impact on platforms that handle crores of customers and millions of transactions daily.
- As an engineer, you will use the latest cloud services to design and develop reusable core components and frameworks to modernise data integrations in a cloud first world and own those integrations end to end working closely with business units. You will design and build for efficiency, reliability, security and scalability. As a consultant, you will help drive a data engineering culture and advocate best practices.
Mandatory experience
- 1-6 years of relevant experience
- Strong SQL skills and data literacy
- Hands-on experience designing and developing data integrations, either in ETL tools, cloud native tools or in custom software
- Proficiency in scripting and automation (e.g. PowerShell, Bash, Python)
- Experience in an enterprise data environment
- Strong communication skills
Desirable experience
- Ability to work on data architecture, data models, data migration, integration and pipelines
- Ability to work on data platform modernisation from on-premise to cloud-native
- Proficiency in data security best practices
- Stakeholder management experience
- Positive attitude with the flexibility and ability to adapt to an ever-changing technology landscape
- Desire to gain breadth and depth of technologies to support customer's vision and project objectives
What to expect if you join Servian?
- Learning & Development: We invest heavily in our consultants and offer weekly internal training (both technical and non-technical alike!) and abide by a ‘You Pass, We Pay’ policy.
- Career progression: We take a longer term view of every hire. We have a flat org structure and promote from within. Every hire is developed as a future leader and client adviser.
- Variety of projects: As a consultant, you will have the opportunity to work across multiple projects across our client base significantly increasing your skills and exposure in the industry.
- Great culture: Working on the latest Apple MacBook Pro in our custom-designed offices in the heart of leafy Jayanagar, we provide a peaceful and productive work environment close to shops, parks, and the metro station.
- Professional development: We invest heavily in professional development both technically, through training and guided certification pathways, and in consulting, through workshops in client engagement and communication. Growth in our organisation happens from the growth of our people.
Work closely with different Front Office and Support Function stakeholders including, but not restricted to, Business Management, Accounts, Regulatory Reporting, Operations, Risk, Compliance, and HR on all data collection and reporting use cases.
Collaborate with Business and Technology teams to understand enterprise data, create an innovative narrative to explain, engage and enlighten regular staff members as well as executive leadership with data-driven storytelling
Solve data consumption and visualization through data as a service distribution model
Articulate findings clearly and concisely for different target use cases, including through presentations, design solutions, visualizations
Perform ad hoc / automated report generation tasks using Power BI, Oracle BI, Informatica
Perform data access/transfer and ETL automation tasks using Python, SQL, OLAP / OLTP, RESTful APIs, and IT tools (CFT, MQ-Series, Control-M, etc.)
Provide support and maintain the availability of BI applications irrespective of the hosting location
Resolve issues escalated from Business and Functional areas on data quality, accuracy, and availability, provide incident-related communications promptly
Work with strict deadlines on high priority regulatory reports
Serve as a liaison between business and technology to ensure that data-related business requirements for protecting sensitive data are clearly defined, communicated, well understood, and considered as part of operational prioritization and planning
Work for the APAC Chief Data Office and coordinate with a fully decentralized team across different locations in APAC and the global HQ (Paris).
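The role above mentions ETL automation tasks using Python and SQL. As a self-contained sketch (the table, desk names, and figures are invented for illustration; a real pipeline would read from an actual database and hand the output to Power BI or a file-transfer tool), an extract-transform-load step could look like:

```python
import sqlite3, csv, io

# Extract: hypothetical source table (in-memory stand-in for an OLTP database)
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE trades (trade_id INTEGER, desk TEXT, notional REAL)")
src.executemany("INSERT INTO trades VALUES (?, ?, ?)",
                [(1, "FX", 1_000_000.0), (2, "Rates", 250_000.0), (3, "FX", 500_000.0)])

# Transform: aggregate notionals per desk with plain SQL
rows = src.execute(
    "SELECT desk, SUM(notional) AS total_notional FROM trades GROUP BY desk ORDER BY desk"
).fetchall()

# Load: write a CSV report for downstream consumption
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["desk", "total_notional"])
writer.writerows(rows)
print(buf.getvalue())
```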
General Skills:
Excellent knowledge of RDBMS and hands-on experience with complex SQL is a must, some experience in NoSQL and Big Data Technologies like Hive and Spark would be a plus
Experience with industrialized reporting on BI tools like Power BI and Informatica
Knowledge of data related industry best practices in the highly regulated CIB industry, experience with regulatory report generation for financial institutions
Knowledge of industry-leading data access, data security, Master Data, and Reference Data Management, and establishing data lineage
5+ years of experience in Data Visualization / Business Intelligence / ETL developer roles
Ability to multi-task and manage various projects simultaneously
Attention to detail
Ability to present to Senior Management, ExCo; excellent written and verbal communication skills
Ganit has flipped the data science value chain: we do not start with a technique; for us, consumption comes first. With this philosophy, we have successfully scaled from a small start-up to a 200-person company with clients in the US, Singapore, Africa, the UAE, and India.
We are looking for experienced data enthusiasts who can make the data talk to them.
You will:
- Understand business problems and translate business requirements into technical requirements.
- Conduct complex data analysis to ensure data quality & reliability i.e., make the data talk by extracting, preparing, and transforming it.
- Identify, develop and implement statistical techniques and algorithms to address business challenges and add value to the organization.
- Gather requirements and communicate findings in the form of a meaningful story with the stakeholders
- Build & implement data models using predictive modelling techniques. Interact with clients and provide support for queries and delivery adoption.
- Lead and mentor data analysts.
We are looking for someone who has:
- Apart from your love for data and the ability to code even while sleeping, you would need the following.
- A minimum of 2 years of experience in the design and delivery of data science solutions.
- Successful retail/BFSI/FMCG/Manufacturing/QSR projects in your kitty to show off.
- Deep understanding of various statistical techniques, mathematical models, and algorithms to start the conversation with the data in hand.
- Ability to choose the right model for the data and translate that into a code using R, Python, VBA, SQL, etc.
- Bachelor's/Master's degree in Engineering/Technology, an MBA from a Tier-1 B-school, or an M.Sc. in Statistics or Mathematics
Skillset Required:
- Regression
- Classification
- Predictive Modelling
- Prescriptive Modelling
- Python
- R
- Descriptive Modelling
- Time Series
- Clustering
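The posting above asks for the ability to choose the right model for the data and translate it into code. As a minimal, purely illustrative sketch with synthetic numbers (a real project would use statsmodels/scikit-learn in Python or lm() in R rather than hand-rolling the math), an ordinary least-squares regression fit looks like:

```python
# Synthetic data, roughly y = 2x; slope/intercept recovered by OLS
xs = [1, 2, 3, 4, 5]
ys = [2.1, 4.0, 6.2, 7.9, 10.1]

n = len(xs)
mean_x, mean_y = sum(xs) / n, sum(ys) / n
# slope = covariance(x, y) / variance(x)
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x
print(f"y = {slope:.2f}*x + {intercept:.2f}")
```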
What is in it for you:
- Be a part of building the biggest brand in Data science.
- An opportunity to be a part of a young and energetic team with a strong pedigree.
- Work on awesome projects across industries and learn from the best in the industry, while growing at a hyper rate.
Please Note:
At Ganit, we are looking for people who love problem solving. You are encouraged to apply even if your experience does not precisely match the job description above. Your passion and skills will stand out and set you apart—especially if your career has taken some extraordinary twists and turns over the years. We welcome diverse perspectives, people who think rigorously and are not afraid to challenge assumptions in a problem. Join us and punch above your weight!
Ganit is an equal opportunity employer and is committed to providing a work environment that is free from harassment and discrimination.
All recruitment, selection procedures and decisions will reflect Ganit’s commitment to providing equal opportunity. All potential candidates will be assessed according to their skills, knowledge, qualifications, and capabilities. No regard will be given to factors such as age, gender, marital status, race, religion, physical impairment, or political opinions.
● Ability to do exploratory analysis: fetch data from systems and analyse trends.
● Develop customer segmentation models to improve the efficiency of marketing and product campaigns.
● Establish mechanisms for cross-functional teams to consume customer insights to improve engagement along the customer life cycle.
● Gather requirements for dashboards from business, marketing, and operations stakeholders.
● Prepare internal reports for executive leadership and support their decision-making.
● Analyse data, derive insights, and embed them into business actions.
● Work with cross-functional teams.
Skills Required
• A data analytics visionary.
• Strong in SQL and Excel; experience in Tableau is good to have.
• Experience in the field of data analysis and data visualization.
• Strong at analysing data and creating dashboards.
• Strong in communication, presentation, and business intelligence.
• A multi-dimensional, "growth hacker" skill set with a strong sense of ownership of work.
• An aggressive, "take no prisoners" approach.
- Understand the business drivers and analytical use-cases.
- Translate use cases to data models, descriptive, analytical, predictive, and engineering outcomes.
- Explore new technologies and learn new techniques to solve business problems creatively
- Think big and drive the strategy for better data quality for the customers.
- Become the voice of business within engineering and of engineering within the business with customers.
- Collaborate with many teams - engineering and business, to build better data products and services
- Deliver the projects along with the team collaboratively and manage updates to customers on time
What we're looking for :
- Hands-on experience in data modeling, data visualization, and pipeline design and development
- Hands-on exposure to machine learning concepts like supervised learning, unsupervised learning, RNNs, and DNNs.
- Prior experience working with business stakeholders, in an enterprise space is a plus
- Great communication skills. You should be able to directly communicate with senior business leaders, embed yourself with business teams, and present solutions to business stakeholders
- Experience in working independently and driving projects end to end, strong analytical skills.
A Services company
Work Location: Bangalore
Shift: Day Time
Primary Skills & Responsibilities
• Strong knowledge in Power BI (DAX + Power Query + Power BI Service + Power BI Desktop Visualisations) and Azure Data Storages.
- Expecting a minimum of 2-4 years of relevant experience
- You will be managing a team of 3 currently
- Take up the ownership of developing and managing one of the largest and richest food (recipe, menu, and CPG) databases
- Regular interactions with cross-functional teams (Business, Food Science, Product, and Tech) to plan the future of client and internal food data management
- Should have a natural flair for playing with numbers and data and have a keen eye for detail and quality
- Will spearhead the Ops team in achieving the targets while maintaining a staunch attentiveness to Coverage, Completeness, and Quality of the data
- Shall plan and manage projects while identifying opportunities to optimize costs and processes.
- Good business acumen in creating logic and process flows, along with quick and smart decision-making skills, is expected
- Will also be responsible for the recruitment, induction, and training of new members
- Set competitive team targets, and guide and support team members to go the extra mile and achieve them
Added Advantages :
- Experience in a Food Sector / Insights company
- Has a passion for exploring different cuisines
- Understands industry-related jargon and has a natural flair for learning more about anything related to food
- Proficient in Python 2 / 3 and have built products before (managing the entire DevOps cycle) - proficient in Python Data Structures
- Good understanding of Open Source Visualisation Tools like D3.JS, Matplotlib etc.
- Good understanding of functional APIs and data ingestion, viz. Pandas DataFrames from SQL databases, Excel (CSV), and flat files of (primarily) structured data - no data cleansing
- Growth mindset - eagerness to learn and grow in the role
- Qualifications: B.Tech / BE / Graduate Engineer in any stream (preferred Mech / Industrial Engineer)
- Compensation : At par with industry (with stock options / grants to be vested over 3 years)
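The last posting above asks for data ingestion from SQL databases and CSV flat files into Pandas DataFrames. As a self-contained sketch using only the standard library (sqlite3 and csv stand in here for a real database and file; the pandas equivalents would be `pd.read_sql(...)` and `pd.read_csv(...)`, and the table and values are invented for illustration):

```python
import sqlite3, csv, io

# --- from a SQL DB (pandas equivalent: pd.read_sql("SELECT ...", conn)) ---
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (sensor TEXT, value REAL)")
conn.executemany("INSERT INTO readings VALUES (?, ?)", [("a", 1.5), ("b", 2.5)])
sql_rows = conn.execute("SELECT sensor, value FROM readings").fetchall()

# --- from a CSV flat file (pandas equivalent: pd.read_csv(path)) ---
csv_text = "sensor,value\na,1.5\nb,2.5\n"
csv_rows = [(r["sensor"], float(r["value"]))
            for r in csv.DictReader(io.StringIO(csv_text))]

assert sql_rows == csv_rows  # same structured records from both sources
print(csv_rows)
```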