Tableau developer
3 - 8 yrs
₹5L - ₹12L / yr
Chennai, Bengaluru (Bangalore)
Skills
Tableau
SQL
PL/SQL
Responsibilities


In this role, the candidate will be responsible for developing Tableau reports, writing effective and scalable code, and improving the functionality of existing reports and systems.

·       Design stable, scalable code.

·       Identify potential improvements to the current design/processes.

·       Participate in multiple project discussions as a senior member of the team.

·       Serve as a coach/mentor for junior developers.


Minimum Qualifications

·       3 - 8 Years of experience

·       Excellent written and verbal communication skills

 

Must have skills

·       Meaningful work experience

·       Extensive hands-on experience with Tableau as a BI reporting tool, developing reports that fulfill end-user requirements.

·       Experienced in interacting with business users to analyze business processes and requirements, and in translating those requirements into visualizations and reports.

·       Must have knowledge of selecting appropriate data visualization strategies (e.g., chart types) for specific use cases. Ability to showcase complete dashboard implementations that demonstrate visual best practices (e.g., color themes, layout, interactivity, drill-down capabilities, filtering).

·       Should be an independent contributor with experience working with senior leaders.

·       Able to explore options and suggest new solutions and visualization techniques to the customer.

·       Experience crafting joins, custom SQL, and data blending across different data sources using Tableau Desktop.

·       Experience building sophisticated calculations in Tableau Desktop (Aggregate, Date, Logical, String, Table, and LOD expressions); see the sketch after this list.

·       Working with relational data sources (like Oracle / SQL Server / DB2) and flat files.

·       Optimizing user queries and dashboard performance.

·       Knowledge of SQL and PL/SQL.

·       Knowledge of crafting DB views and materialized views.

·       Excellent verbal and written communication skills and interpersonal skills are required.

·       Excellent documentation and presentation skills; should be able to build business process mapping documents and functional solution documents, and own the acceptance/sign-off process end to end.

·       Ability to make the right chart choices, use the data blending feature, and connect to several database technologies.

·       Must stay up to date on new and upcoming visualization technologies.
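
To ground the calculation bullet above, here is a minimal, hypothetical Pandas sketch of the pattern behind a Tableau FIXED LOD expression such as {FIXED [Customer] : SUM([Sales])}; the column names are invented for illustration.

```python
# Hypothetical illustration of the logic behind a Tableau FIXED LOD expression,
# e.g. {FIXED [Customer] : SUM([Sales])}, using a Pandas groupby/transform.
import pandas as pd

orders = pd.DataFrame({
    "customer": ["A", "A", "B", "B", "B"],
    "sales":    [100, 150, 200, 50, 25],
})

# Per-customer total repeated on every row of that customer (like a FIXED LOD).
orders["customer_total_sales"] = orders.groupby("customer")["sales"].transform("sum")

# Share of each order within its customer's total.
orders["pct_of_customer_total"] = orders["sales"] / orders["customer_total_sales"]
print(orders)
```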

 

Preferred location: Chennai (priority) / Bengaluru


About IT Consulting, System Integrator & Software Services Company


Similar jobs

DeepIntent
Posted by Indrajeet Deshmukh
Pune
3 - 5 yrs
Best in industry
PySpark
Data engineering
Big Data
Hadoop
Spark
+5 more

About DeepIntent:

DeepIntent is a marketing technology company that helps healthcare brands strengthen communication with patients and healthcare professionals by enabling highly effective and performant digital advertising campaigns. Our healthcare technology platform, MarketMatch™, connects advertisers, data providers, and publishers to operate the first unified, programmatic marketplace for healthcare marketers. The platform’s built-in identity solution matches digital IDs with clinical, behavioural, and contextual data in real-time so marketers can qualify 1.6M+ verified HCPs and 225M+ patients to find their most clinically-relevant audiences and message them on a one-to-one basis in a privacy-compliant way. Healthcare marketers use MarketMatch to plan, activate, and measure digital campaigns in ways that best suit their business, from managed service engagements to technical integration or self-service solutions. DeepIntent was founded by Memorial Sloan Kettering alumni in 2016 and acquired by Propel Media, Inc. in 2017. We proudly serve major pharmaceutical and Fortune 500 companies out of our offices in New York, Bosnia and India.


What You’ll Do:

  • Establish a formal data practice for the organisation.
  • Build and operate scalable and robust data architectures.
  • Create pipelines for the self-service introduction and usage of new data.
  • Implement DataOps practices.
  • Design, develop, and operate data pipelines which support data scientists and machine learning engineers.
  • Build simple, highly reliable data storage, ingestion, and transformation solutions which are easy to deploy and manage.
  • Collaborate with various business stakeholders, software engineers, machine learning engineers, and analysts.

Who You Are:

  • Experience in designing, developing, and operating configurable data pipelines serving high-volume and high-velocity data (see the sketch after this list).
  • Experience working with public clouds like GCP/AWS.
  • Good understanding of software engineering, DataOps, data architecture, Agile and DevOps methodologies.
  • Experience building data architectures that optimize performance and cost, whether the components are prepackaged or homegrown.
  • Proficient with SQL, Java, Spring Boot, Python or a JVM-based language, and Bash.
  • Experience with Apache open-source projects such as Spark, Druid, Beam, Airflow, etc., and big data databases like BigQuery, ClickHouse, etc.
  • Good communication skills with the ability to collaborate with both technical and non-technical people.
  • Ability to Think Big, take bets and innovate, Dive Deep, Bias for Action, Hire and Develop the Best, Learn and be Curious
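
As a rough illustration of the configurable batch pipelines described above, here is a minimal, hypothetical PySpark job; the paths, columns, and aggregation are invented for the example and are not DeepIntent's actual pipeline.

```python
# Minimal, hypothetical PySpark batch pipeline: read raw events, apply a simple
# transformation, and write a curated, partitioned table. Names are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example_pipeline").getOrCreate()

raw = spark.read.option("header", True).csv("/data/raw/ad_events.csv")

curated = (
    raw.filter(F.col("event_type").isNotNull())
       .withColumn("event_date", F.to_date("event_timestamp"))
       .groupBy("campaign_id", "event_date")
       .agg(F.count("*").alias("events"))
)

curated.write.mode("overwrite").partitionBy("event_date").parquet("/data/curated/daily_events")
```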

 

Bengaluru (Bangalore)
5 - 10 yrs
Best in industry
ETL
Informatica
Data Warehouse (DWH)
PowerBI
databricks
+4 more

About The Company


The client is a 17-year-old multinational company headquartered in Whitefield, Bangalore, with another delivery center in Hinjewadi, Pune. It also has offices in the US and Germany, works with several OEMs and product companies in about 12 countries, and has a 200+ strong team worldwide.


The Role


Power BI front-end developer in the Data Domain (Manufacturing, Sales & Marketing, Purchasing, Logistics, …). Responsible for the Power BI front-end design, development, and delivery of highly visible data-driven applications in the Compressor Technique. You always take a quality-first approach, ensuring the data is visualized in a clear, accurate, and user-friendly manner. You always ensure standards and best practices are followed and ensure documentation is created and maintained. Where needed, you take initiative and make recommendations to drive improvements. In this role you will also be involved in the tracking, monitoring, and performance analysis of production issues and the implementation of bugfixes and enhancements.


Skills & Experience


• The ideal candidate has a degree in Computer Science, Information Technology, or equivalent through experience.

• Strong knowledge of BI development principles, time intelligence, functions, dimensional modeling, and data visualization is required.

• Advanced knowledge and 5-10 years of experience with professional BI development & data visualization are preferred.

• You are familiar with data warehouse concepts.

• Knowledge of MS Azure (Data Lake, Databricks, SQL) is considered a plus.

• Experience and knowledge of scripting languages such as PowerShell and Python to set up and automate Power BI platform-related activities is an asset.

• Good knowledge (oral and written) of English is required.

Bengaluru (Bangalore)
2 - 8 yrs
₹4L - ₹10L / yr
Data governance
Data security
Data Analytics
Informatica
SQL
+4 more

Job Description

We are looking for a senior resource with Analyst skills and knowledge of IT projects, to support delivery of risk mitigation activities and automation in Aviva’s Global Finance Data Office. The successful candidate will bring structure to this new role in a developing team, with excellent communication, organisational and analytical skills. The Candidate will play the primary role of supporting data governance project/change activities. Candidates should be comfortable with ambiguity in a fast-paced and ever-changing environment. Preferred skills include knowledge of Data Governance, Informatica Axon, SQL, AWS. In our team, success is measured by results and we encourage flexible working where possible.

Key Responsibilities

  • Engage with stakeholders to drive delivery of the Finance Data Strategy
  • Support data governance project/change activities in Aviva’s Finance function.
  • Identify opportunities for and implement automations to enhance the team's performance

Required profile

  • Relevant work experience in at least one of the following: business/project analyst, project/change management and data analytics.
  • Proven track record of successful communication of analytical outcomes, including an ability to effectively communicate with both business and technical teams.
  • Ability to manage multiple, competing priorities and hold the team and stakeholders to account on progress.
  • Contribute to, plan, and execute an end-to-end data governance framework.
  • Basic knowledge of IT systems/projects and the development lifecycle.
  • Experience gathering business requirements and reports.
  • Advanced experience with MS Excel data processing (VBA macros).
  • Good communication skills

 

Additional Information

Degree in a quantitative or scientific field (e.g. Engineering, MBA Finance, Project Management) and/or experience in data governance/quality/privacy
Knowledge of Finance systems/processes
Experience in analysing large data sets using dedicated analytics tools

 

Designation – Assistant Manager TS

Location – Bangalore

Shift – 11 – 8 PM
Celebal Technologies Pvt Ltd
Posted by Anjani Upadhyay
Jaipur
3 - 6 yrs
₹10L - ₹15L / yr
ETL
Informatica
Data Warehouse (DWH)
SQL
NOSQL Databases
+6 more

Job Description: 

An Azure Data Engineer is responsible for designing, implementing, and maintaining pipelines and ETL/ELT flow solutions on the Azure cloud platform. This role requires a strong understanding of database migration technologies and the ability to deploy and manage database solutions in the Azure cloud environment.

 

Key Skills:

·      Minimum 3+ years of experience with data modeling, data warehousing, and building ETL pipelines.

·      Must have firm knowledge of SQL, NoSQL, SSIS, SSRS, and ETL/ELT concepts.

·      Should have hands-on experience in Databricks, ADF (Azure Data Factory), ADLS, and Cosmos DB (see the upsert sketch after the technical expertise list).

·      Excel in the design, creation, and management of very large datasets.

·      Detailed knowledge of cloud-based data warehouses, architecture, infrastructure components, ETL, and reporting analytics tools and environments.

·      Skilled in writing, tuning, and troubleshooting SQL queries.

·      Experience with big data technologies spanning data storage, data mining, data analytics, and data visualization.

·      Should be familiar with programming and able to write and debug code in any of the programming languages such as Node, Python, C#, .NET, or Java.

 

Technical Expertise and Familiarity:

  • Cloud Technologies: Azure (ADF, ADB, Logic Apps, Azure SQL database, Azure Key Vaults, ADLS, Synapse)
  • Database: CosmosDB, Document DB  
  • IDEs: Visual Studio, VS Code, MS SQL Server
  • Data Modelling, ELT, ETL Methodology
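
As a hedged illustration of the Databricks/ELT upsert pattern referenced in the key skills, the sketch below uses the open-source Delta Lake Python API; the table paths, join key, and staging data are assumptions for the example, not the company's actual pipeline, and `spark` is the session a Databricks cluster provides.

```python
# Hypothetical Delta Lake upsert (MERGE) on Databricks; paths and keys are
# illustrative. Assumes the delta-lake package and a provided `spark` session.
from delta.tables import DeltaTable

# Latest batch of customer records to merge into the curated table (assumed path).
updates_df = spark.read.parquet("/mnt/staging/customer_updates")

target = DeltaTable.forPath(spark, "/mnt/curated/customers")

(
    target.alias("t")
    .merge(updates_df.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()     # update existing customers
    .whenNotMatchedInsertAll()  # insert new customers
    .execute()
)
```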

 

 

 

 

 

 

Oneture Technologies
Posted by Ravi Mevcha
Mumbai, Navi Mumbai
2 - 4 yrs
₹8L - ₹12L / yr
Spark
Big Data
ETL
Data engineering
ADF
+4 more

Job Overview


We are looking for a Data Engineer to join our data team to solve data-driven critical business problems. The hire will be responsible for expanding and optimizing the existing end-to-end architecture, including the data pipeline architecture. The Data Engineer will collaborate with software developers, database architects, data analysts, data scientists, and the platform team on data initiatives, and will ensure the optimal data delivery architecture is consistent throughout ongoing projects. The right candidate should have hands-on experience in developing a hybrid set of data pipelines depending on the business requirements.

Responsibilities

  • Develop, construct, test, and maintain existing and new data-driven architectures.
  • Align architecture with business requirements and provide solutions which fit best to solve the business problems.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and Azure ‘big data’ technologies.
  • Data acquisition from multiple sources across the organization.
  • Use programming languages and tools efficiently to collate the data.
  • Identify ways to improve data reliability, efficiency, and quality.
  • Use data to discover tasks that can be automated.
  • Deliver updates to stakeholders based on analytics.
  • Set up practices on data reporting and continuous monitoring.

Required Technical Skills

  • Graduate in Computer Science or a similar quantitative area.
  • 1+ years of relevant work experience as a Data Engineer or in a similar role.
  • Advanced SQL knowledge and data modelling, experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases.
  • Experience in developing and optimizing ETL pipelines, big data pipelines, and data-driven architectures.
  • Must have strong big-data core knowledge and experience in programming using Spark - Python/Scala.
  • Experience with an orchestration tool like Airflow or similar (see the DAG sketch after this list).
  • Experience with Azure Data Factory is good to have.
  • Build processes supporting data transformation, data structures, metadata, dependency, and workload management.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
  • Good understanding of Git workflow and test-case-driven development; experience using CI/CD is good to have.
  • Good to have some understanding of Delta tables.
  • It would be an advantage if the candidate also has experience with the following software/tools:
  • Experience with big data tools: Hadoop, Spark, Hive, etc.
  • Experience with relational SQL and NoSQL databases.
  • Experience with cloud data services.
  • Experience with object-oriented/object function scripting languages: Python, Scala, etc.
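
To illustrate the orchestration requirement above, here is a minimal, hypothetical Airflow DAG; the DAG id, schedule, and task logic are invented for the example and are not this team's actual pipeline.

```python
# Minimal, hypothetical Airflow DAG: a daily extract -> transform dependency.
# Task names and schedule are illustrative only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw data from a source system.
    print("extracting raw data")


def transform():
    # Placeholder: clean and load the extracted data.
    print("transforming and loading data")


with DAG(
    dag_id="daily_sales_pipeline",      # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",         # newer Airflow versions use `schedule=`
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task      # transform runs only after extract succeeds
```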
Global Media Agency
Gurugram
2 - 6 yrs
₹10L - ₹12L / yr
SQL
Campaign Analytics
Google Analytics
Data Analytics
We are looking for a Manager - Campaign Analytics for one of the leading Global Media agencies in Delhi.
 
Role - Manager (Campaign Analytics)
Experience - 2 to 6 years
Location - Gurgaon/Gurugram, Delhi

About our Client:-

Our client is part of a global media agency whose mission is to make advertising more valuable to the world. They do this by employing the world's very best talent to solve some of the toughest challenges of today's digital marketing landscape. They hire people whose values reflect their own: genuine, results-focused, daring, and insightful. They promise you a workplace that invests in your career, cares for you, and is fun and engaging.

About the role:-

- Accountable for quantifying and measuring the success of our executions and for delivering insights that enable us to innovate the work we deliver at Essence. You will be supporting analyses for campaigns and helping develop new analytical tools.

Some of the things we’d like you to do :-

● Take ownership of delivery for designated clients/projects
● Collaborate with the campaign team to identify opportunities where our analytics offering can add value
● Gain an understanding of marketing plans and their objectives to be able to assist with comprehensive measurement, and test & learn plans
● Develop analytical techniques to analyze results from experiments/studies/observations
● Explore methods/techniques for solving business problems given the data available (potentially 3rd party)
● Help internal and external clients understand the capability and limitations of methods used
● Develop and present compelling and innovative presentations.

A bit about yourself :-

● Degree from a top-tier College, 3.0 GPA or equivalent (preferably numerical)
● Proficiency with systems such as SQL, Social Analytics tools (Crimson Hexagon), Python, and ‘R’
● Have some understanding of marketing campaigns and their objectives
● Strong analytical skills - ability to analyze raw data, find insights, and provide actionable strategic recommendations
● Ability to manage someone effectively to bring out the best in their skill sets, motivating them to succeed, and keeping their focus
● Strong work ethic, with ability to manage multiple projects, people, and time zones to meet deadlines and deliver results
Service based company
Remote only
3 - 8 yrs
₹8L - ₹13L / yr
pandas
PySpark
Big Data
Data engineering
Performance optimization
+3 more
Data pre-processing, data transformation, data analysis, and feature engineering.
The candidate must have expertise in ADF (Azure Data Factory) and be well versed with Python.
Performance optimization of scripts (code) and productionizing of code (SQL, Pandas, Python or PySpark, etc.); see the sketch below.
Required skills:
Bachelor's in Computer Science, Data Science, Computer Engineering, IT, or equivalent
Fluency in Python (Pandas), PySpark, SQL, or similar
Azure Data Factory experience (minimum 12 months)
Able to write efficient code using traditional and OO concepts and modular programming, following the SDLC process.
Experience in production optimization and end-to-end performance tracing (technical root cause analysis)
Ability to work independently, with demonstrated experience in project or program management
Azure experience; ability to translate data scientists' Python code and make it efficient (production-ready) for cloud deployment
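
A small, hypothetical sketch of the kind of script optimization described above: replacing a row-wise Pandas apply with a vectorized operation (the DataFrame and columns are invented for illustration).

```python
# Hypothetical Pandas performance optimization: vectorize instead of apply().
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "units": np.random.randint(1, 100, 1_000_000),
    "unit_price": np.random.rand(1_000_000) * 50,
})

# Slow: row-by-row Python-level loop via apply.
# df["revenue"] = df.apply(lambda row: row["units"] * row["unit_price"], axis=1)

# Fast: vectorized column arithmetic runs in optimized native code.
df["revenue"] = df["units"] * df["unit_price"]
```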
USEReady
Posted by Shree Murthy
Bengaluru (Bangalore)
4 - 10 yrs
₹5L - ₹15L / yr
Spotfire
Alteryx
Birst
MSBI
Power BI
+3 more
Sr BI Analyst / Lead BI Analyst

Must Have:
  • At least 3 to 6 years of techno-functional / technical experience in delivering projects for large enterprises
  • Strong background in data analysis, data sources, data warehousing, and business intelligence
  • Progressive experience in areas of ETL, data modelling, data visualization, reporting, and analytics
  • Familiarity with database concepts, SQL queries, joins, and programming
  • Experience in system administration, problem debugging, and solution design
  • Formal education in technology, engineering, and programming
  • Ability to demonstrate strong technical aptitude
  • Experience in handling customer communication
  • Ability to manage individual contribution and workload, setting expectations and self-learning
  • Familiarity with self-service BI products such as Tableau, QlikView, Spotfire, Power BI, MSBI

Additional Requirements (Nice to have):
  • Understanding of data challenges encountered by business users
  • Domain expertise in any of the following domains highly desired: Finance, Banking, Insurance, Healthcare, Pharmaceuticals
Largest Analytical firm
Bengaluru (Bangalore)
4 - 14 yrs
₹10L - ₹28L / yr
Hadoop
Big Data
Spark
Scala
Python
+2 more

·        Advanced Spark Programming Skills

·        Advanced Python Skills

·        Data Engineering ETL and ELT Skills

·        Expertise in streaming data

·        Experience in the Hadoop ecosystem

·        Basic understanding of Cloud Platforms

·        Technical Design Skills, alternative approaches

·        Hands-on expertise in writing UDFs (see the sketch after this list)

·        Hands-on expertise in streaming data ingestion

·        Able to independently tune Spark scripts

·        Advanced debugging skills & large-volume data handling.

·        Independently break down and plan technical tasks
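
To illustrate the UDF requirement above, here is a minimal, hypothetical PySpark example; the function and column names are invented for illustration and are not a specific production job.

```python
# Minimal, hypothetical PySpark UDF: wrapping a Python function for DataFrame use.
from pyspark.sql import SparkSession
from pyspark.sql.functions import udf
from pyspark.sql.types import StringType

spark = SparkSession.builder.appName("udf_sketch").getOrCreate()

df = spark.createDataFrame([("alice",), ("BOB",)], ["raw_name"])

def title_case(name: str) -> str:
    # Python logic exposed to the DataFrame API via a UDF.
    return name.title() if name else name

title_case_udf = udf(title_case, StringType())

# Note: built-in functions (e.g. initcap) are faster than Python UDFs when one
# exists; UDFs are for logic the built-ins cannot express.
df.withColumn("clean_name", title_case_udf(df["raw_name"])).show()
```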

Yulu Bikes
Posted by Keerthana k
Bengaluru (Bangalore)
1 - 2 yrs
₹7L - ₹12L / yr
Data Science
Data Analytics
SQL
Python
Datawarehousing
+2 more
Skill Set 
SQL, Python, NumPy, Pandas; knowledge of Hive and data warehousing concepts will be a plus.

JD 

- Strong analytical skills with the ability to collect, organise, analyse and interpret trends or patterns in complex data sets and provide reports & visualisations.

- Work with management to prioritise business KPIs and information needs. Locate and define new process improvement opportunities.

- Technical expertise with data models, database design and development, data mining and segmentation techniques

- Proven success in a collaborative, team-oriented environment

- Working experience with geospatial data will be a plus.