
Key Responsibilities:
•Design, develop, support, and maintain automated business intelligence products in Tableau.
•Rapidly design, develop, and implement reporting applications that embed KPI metrics and actionable insights into the operational, tactical, and strategic activities of key business functions.
•Communicate effectively with users and other technical teams, with a proven record of success.
•Identify business requirements, design processes that leverage/adapt the business logic and regularly communicate with business stakeholders to ensure delivery meets business needs.
•Design, code, and review business intelligence projects developed in Tableau and Power BI.
•Work as a team member and team lead to implement BI solutions for our customers.
•Develop dashboards and data sources that meet and exceed customer requirements.
•Partner with business information architects to understand the business use cases that support and fulfill business and data strategy.
•Partner with Product Owners and cross-functional teams in a collaborative and agile environment.
•Provide best practices for data visualization and Tableau implementations.
•Work alongside solution architects on RFI/RFP response solution design, customer presentations, demonstrations, POCs, etc., to support growth.
Desired Candidate Profile:
•6-10 years of programming experience and demonstrated proficiency with Tableau; certification in Tableau is highly preferred.
•Ability to architect and scope complex projects.
•Strong understanding of SQL and a basic understanding of programming languages; experience with SAQL, SOQL, Python, or R is a plus.
•Applied experience in Agile development processes (Scrum).
•Ability to independently learn new technologies.
•Ability to show initiative and work independently with minimal direction.
•Presentation skills: demonstrated ability to simplify complex situations and ideas and distill them into compelling and effective written and oral presentations.
•Learn quickly: ability to rapidly comprehend new functional and technical areas and apply detailed, critical thinking to customer solutions.
Education:
•Bachelor's or master's degree in Computer Science, Computer Engineering, or a quantitative field such as Statistics, Mathematics, Operations Research, Economics, or Advanced Analytics.

About Blend360
Our client combines Adtech and Martech platform strategy with data science & data engineering expertise, helping our clients make advertising work better for people.
- Act as the primary day-to-day analytics contact for agency and client leads
- Develop bespoke analytics proposals for presentation to agencies and clients, for delivery within the teams
- Ensure delivery of projects and services across the analytics team meets stakeholder requirements (time, quality, cost)
- Work hands-on with platforms to perform data pre-processing, including data transformation and data cleaning
- Ensure data quality and integrity
- Interpret and analyse data problems
- Build analytic systems and predictive models
- Increase the performance and accuracy of machine learning algorithms through fine-tuning and further optimization (see the sketch after this list)
- Visualize data and create reports
- Experiment with new models and techniques
- Align data projects with organizational goals
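As a rough, hypothetical illustration of the fine-tuning bullet above, the Python sketch below runs a small hyperparameter search with scikit-learn; the synthetic data, random-forest model, and parameter grid are all assumptions for illustration, not part of the role.

```python
# Hypothetical sketch: improving model accuracy via hyperparameter fine-tuning.
# The synthetic data, model choice, and parameter grid are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# Stand-in for real campaign or behavioral data
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

search = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid={"n_estimators": [100, 300], "max_depth": [None, 10]},
    scoring="roc_auc",
    cv=5,
)
search.fit(X, y)
print("best params:", search.best_params_)
print("best CV AUC:", round(search.best_score_, 3))
```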
Requirements
- Minimum 6-7 years' experience working in Data Science
- Prior experience as a Data Scientist within a digital media environment is desirable
- Solid understanding of machine learning
- A degree in a quantitative field (e.g. economics, computer science, mathematics, statistics, engineering, physics, etc.)
- Experience with SQL, BigQuery, the GMP tech stack, and clean rooms such as ADH
- A knack for statistical analysis and predictive modelling
- Good knowledge of R, Python
- Experience with SQL, MySQL, and PostgreSQL databases
- Knowledge of data management and visualization techniques
- Hands-on experience with BI/visual analytics tools like Power BI, Tableau, or Data Studio
- Evidence of technical comfort and a good understanding of internet functionality is desirable
- Analytical pedigree - evidence of having approached problems from a mathematical perspective and working through to a solution in a logical way
- Proactive and results-oriented
- A positive, can-do attitude with a thirst to continually learn new things
- An ability to work independently and collaboratively with a wide range of teams
- Excellent communication skills, both written and oral
Objective
The Data Engineer will be responsible for expanding and optimizing our data and database architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts, and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.
Roles and Responsibilities:
- Build and optimize performant data pipelines covering data ingestion, data cleansing, and curation into a data warehouse, database, or other data platform using Dask/Spark (see the sketch after this list).
- Experience with distributed computing environments and Spark/Dask architecture.
- Optimize performance for data access requirements by choosing appropriate file formats (Avro, Parquet, ORC, etc.) and compression codecs.
- Experience writing production-ready code and tests in Python; participate in code reviews to maintain and improve code quality, stability, and supportability.
- Experience in designing data warehouse/data mart.
- Experience with any RDBMS, preferably SQL Server; must be able to write complex SQL queries.
- Expertise in requirement gathering, technical design and functional documents.
- Experience in Agile/Scrum practices.
- Experience in leading other developers and guiding them technically.
- Experience in deploying data pipelines using automated CI/CD approach.
- Ability to write modularized reusable code components.
- Proficient in identifying data issues and anomalies during analysis.
- Strong analytical and logical skills.
- Must be able to comfortably tackle new challenges and learn.
- Must have strong verbal and written communication skills.
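As a minimal sketch of the pipeline work described in the first bullets above (ingest, cleanse, and curate data, then write columnar output with a compression codec), here is an illustrative Dask example; the paths, column names, and snappy codec are assumptions, not project specifics.

```python
# Minimal, illustrative ingest-cleanse-curate pipeline in Dask.
# Paths, column names, and the snappy codec are hypothetical placeholders.
import dask.dataframe as dd

# Ingest: read raw CSV extracts in parallel
raw = dd.read_csv("raw/orders-*.csv", dtype={"order_id": "string"})

# Cleanse: drop malformed rows and de-duplicate
clean = raw.dropna(subset=["order_id", "amount"]).drop_duplicates(subset=["order_id"])

# Curate: write columnar Parquet with a compression codec, per the
# file-format/codec bullet above
clean.to_parquet("curated/orders/", compression="snappy", write_index=False)
```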
Required skills:
- Knowledge of GCP
- Expertise in Google BigQuery
- Expertise in Airflow (see the DAG sketch after this list)
- Good hands-on SQL skills
- Data warehousing concepts
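A minimal sketch of the Airflow-plus-BigQuery combination listed above, assuming Airflow 2.x with the Google provider installed; the DAG id, dataset/table names, and SQL are hypothetical placeholders.

```python
# Hypothetical sketch: a daily Airflow DAG that runs a curation query in BigQuery.
# Assumes apache-airflow-providers-google is installed; all names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_curation",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    curate = BigQueryInsertJobOperator(
        task_id="curate_orders",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE analytics.curated_orders AS "
                    "SELECT order_id, SUM(amount) AS total_amount "
                    "FROM raw_zone.orders GROUP BY order_id"
                ),
                "useLegacySql": False,
            }
        },
    )
```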
Our client is an innovative Fintech company that is revolutionizing the business of short-term finance. The company is an online lending startup driven by an app-enabled technology platform to solve the funding challenges of SMEs by offering quick-turnaround, paperless business loans without collateral. It counts over 2 million small businesses across 18 cities and towns as its customers.
- Performing extensive analysis using SQL, Google Analytics, and Excel from a product standpoint to provide quick recommendations to management
- Establishing scalable, efficient and automated processes to deploy data analytics on large data sets across platforms
What you need to have:
- B.Tech/B.E. or any graduate degree
- Strong background in statistical concepts & calculations to perform analysis/ modeling
- Proficient in SQL and in BI tools like Tableau, Power BI, etc.
- Good knowledge of Google Analytics and any other web analytics platforms (preferred)
- Strong analytical and problem-solving skills to analyze large volumes of data
- Ability to work independently and bring innovative solutions to the team
- Experience of working with a start-up or a product organization (preferred)
Immediate Joiners Preferred
NOTE: Working Shift: 11 am - 8 pm IST (+/- one hour on a need basis), Monday to Friday
Responsibilities:
1. Leverage Alteryx to build, maintain, execute, and deliver data assets. This entails using data analytics tools (a combination of proprietary tools, Excel, and scripting), validating raw data quality, and working with Tableau/Power BI and other geospatial data visualization applications
2. Analyze large data sets in Alteryx, finding patterns and providing a concise summary
3. Implement, maintain, and troubleshoot analytics solutions and derive key insights based on Tableau/Power BI visualization and data analysis
4. Improve ETL setup and develop components so they can be rapidly deployed and configured
5. Conduct data assessment, perform data quality checks, and transform data using SQL and ETL tools (see the sketch below)
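As an illustration of item 5, here is a minimal data-quality-check sketch; the posting names SQL and ETL tools, but Python/pandas is used here purely to keep the example self-contained, and the file path and column names are hypothetical.

```python
# Hypothetical sketch of basic data quality checks plus a simple transformation.
# The file path and column names are illustrative placeholders.
import pandas as pd

df = pd.read_csv("extract.csv")

# Assess: count common data quality issues
checks = {
    "null_ids": df["record_id"].isna().sum(),
    "duplicate_rows": df.duplicated().sum(),
    "negative_amounts": (df["amount"] < 0).sum(),
}
for name, count in checks.items():
    print(f"{name}: {count}")

# Transform: de-duplicate and coerce dates, marking unparseable values as NaT
clean = df.drop_duplicates().assign(
    order_date=lambda d: pd.to_datetime(d["order_date"], errors="coerce")
)
```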
Qualification:
1. 3+ years of relevant and professional work experience with a reputed analytics firm
2. Bachelor's degree in Engineering / Information Technology from a reputed college
3. Must have the knowledge to handle, design, and optimize complex ETL workflows in Alteryx
4. Expertise with visualization tools such as Tableau/Power BI to solve business problems related to data exploration and visualization
5. Good knowledge of handling large amounts of data using SQL, T-SQL, or PL/SQL
6. Basic knowledge of Python for data processing is good to have
7. Very good understanding of data warehousing and data lake concepts
Senior Big Data Engineer
Note: Notice Period: 45 days
Banyan Data Services (BDS) is a US-based data-focused company that specializes in comprehensive data solutions and services, headquartered in San Jose, California, USA.
We are looking for a Senior Hadoop Big Data Engineer with expertise in solving complex data problems across a big data platform. You will be part of our development team based out of Bangalore. This team focuses on the most innovative and emerging data infrastructure software and services to support highly scalable and available infrastructure.
It's a once-in-a-lifetime opportunity to join our rocket-ship startup run by a world-class executive team. We are looking for candidates who aspire to be a part of the cutting-edge solutions and services we offer to address next-gen data evolution challenges.
Key Qualifications
· 5+ years of experience working with Java and Spring technologies
· At least 3 years of programming experience working with Spark on big data, including experience with data profiling and building transformations
· Knowledge of microservices architecture is a plus
· Experience with any NoSQL databases such as HBase, MongoDB, or Cassandra
· Experience with Kafka or any streaming tools
· Knowledge of Scala would be preferable
· Experience with agile application development
· Exposure to any cloud technologies, including containers and Kubernetes
· Demonstrated experience performing DevOps for platforms
· Strong skills in data structures and algorithms, with a focus on writing efficient, low-complexity code
· Exposure to Graph databases
· Passion for learning new technologies and the ability to do so quickly
· A Bachelor's degree in a computer-related field or equivalent professional experience is required
Key Responsibilities
· Scope and deliver solutions with the ability to design solutions independently based on high-level architecture
· Design and develop big data-focused microservices
· Work on big data infrastructure, distributed systems, data modeling, and query processing
· Build software with cutting-edge technologies on the cloud
· Willingness to learn new technologies and take on research-oriented projects
· Proven interpersonal skills, contributing to team efforts by accomplishing related results as needed
About Graphene
Graphene is a Singapore-headquartered AI company that has been recognized as Singapore's Best Start-Up by Switzerland's Seedstars World and awarded best AI platform for healthcare at VivaTech Paris. Graphene India is also a member of the exclusive NASSCOM Deeptech club. We are developing an AI platform which is disrupting and replacing traditional market research with unbiased insights, with a focus on healthcare, consumer goods, and financial services.
Graphene was founded by corporate leaders from Microsoft and P&G, and works closely with the Singapore Government and universities to create cutting-edge technology that is gaining traction with many Fortune 500 companies in India, Asia, and the USA.
Graphene’s culture is grounded in delivering customer delight by recruiting high potential talent and providing an intense learning and collaborative atmosphere, with many ex-employees now hired by large companies across the world.
Graphene has a 6-year track record of delivering financially sustainable growth and is one of the rare start-ups that is self-funded yet profitable and debt-free. We have already built a strong bench of Singaporean leaders and are recruiting and grooming more talent with a focus on our US expansion.
Job title: Data Analyst
Job Description
The Data Analyst is responsible for data storage, enrichment, transformation, and gathering based on data requests, as well as testing and maintaining data pipelines.
Responsibilities and Duties
- Managing the end-to-end data pipeline from data source to visualization layer
- Ensuring data integrity and pre-empting data errors
- Organized management and storage of data
- Provide quality assurance of data, working with quality assurance analysts if necessary.
- Commissioning and decommissioning of data sets.
- Processing confidential data and information according to guidelines.
- Helping develop reports and analysis.
- Troubleshooting the reporting database environment and reports.
- Managing and designing the reporting environment, including data sources, security, and metadata.
- Supporting the data warehouse in identifying and revising reporting requirements.
- Supporting initiatives for data integrity and normalization.
- Evaluating changes and updates to source production systems.
- Training end-users on new reports and dashboards.
- Initiate data gathering based on data requirements
- Analyse the raw data to check if the requirement is satisfied
Qualifications and Skills
- Technologies required: Python, SQL/NoSQL databases (Cosmos DB); see the query sketch after this list
- Experience required: 2-5 years, including experience in data analysis using Python
- Understanding of the software development life cycle
- Plan, coordinate, develop, test, and support data pipelines; document and support reporting dashboards (Power BI)
- Automate the steps needed to transform and enrich data.
- Communicate issues, risks, and concerns proactively to management. Document the process thoroughly to allow peers to assist with support as needed.
- Excellent verbal and written communication skills
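A minimal sketch of querying Cosmos DB from Python for analysis, as referenced in the technologies item above; it assumes the azure-cosmos SDK, and the endpoint, key, and database/container names are placeholders.

```python
# Hypothetical sketch: pulling records from Cosmos DB for downstream analysis.
# Assumes the azure-cosmos SDK; endpoint, key, and names are placeholders.
from azure.cosmos import CosmosClient

client = CosmosClient("https://<account>.documents.azure.com:443/", credential="<key>")
container = client.get_database_client("analytics").get_container_client("events")

rows = list(
    container.query_items(
        query="SELECT c.user_id, c.event_type FROM c WHERE c.ts >= @since",
        parameters=[{"name": "@since", "value": "2024-01-01"}],
        enable_cross_partition_query=True,
    )
)
print(f"fetched {len(rows)} events")
```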
Mining large volumes of credit behavior data to generate insights around product holdings and monetization opportunities for cross-sell
Use data science to size the opportunity and product potential for the launch of any new products/pilots
Build propensity models using heuristics and campaign performance data to maximize efficiency (a minimal sketch follows below).
Conduct portfolio analysis and establish key metrics for cross-sell partnerships
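A minimal, hypothetical sketch of the propensity-modeling step in Python/scikit-learn; the file path, feature names, and logistic-regression choice are illustrative assumptions, not the team's actual approach.

```python
# Hypothetical propensity-model sketch: logistic regression on campaign-response data.
# The CSV path, feature names, and target column are illustrative placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

data = pd.read_csv("campaign_history.csv")
features = ["num_products_held", "avg_balance", "months_on_book"]

X_train, X_test, y_train, y_test = train_test_split(
    data[features], data["responded"], test_size=0.2, random_state=42
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
propensity = model.predict_proba(X_test)[:, 1]  # probability of responding
print("holdout AUC:", round(roc_auc_score(y_test, propensity), 3))
```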
Desired profile/Skills:
2-5 years of experience with a degree in any quantitative discipline such as Engineering, Computer Science, Economics, Statistics or Mathematics
Excellent problem solving and comprehensive analytical skills – ability to structure ambiguous problem statements, perform detailed analysis and derive crisp insights.
Solid experience using Python and SQL
Prior work experience in a financial services space would be highly valued
Location: Bangalore/ Ahmedabad
- Build a team with skills in ETL, reporting, MDM and ad-hoc analytics support
- Build technical solutions using the latest open-source and cloud-based technologies
- Work closely with offshore senior consultants, the onshore team, and the client's business and IT teams to gather project requirements
- Assist overall project execution from India, from project planning and team formation through system design and development, testing, UAT, and deployment
- Build demos and POCs in support of business development for new and existing clients
- Prepare project documents and PowerPoint presentations for client communication
- Conduct training sessions to train associates and help shape their growth

