Data Engineer

at MNC
Agency job
2 - 5 yrs
₹7L - ₹12L / yr
Bengaluru (Bangalore)
Skills
Spark
Python
SQL
Primary Responsibilities:
• Responsible for developing and maintaining applications with PySpark
• Contribute to the overall design and architecture of the applications developed and deployed.
• Performance tuning with respect to executor sizing and other environment parameters, code optimization, partition tuning, etc.
• Interact with business users to understand requirements and troubleshoot issues.
• Implement projects based on functional specifications.


Must-Have Skills:

• Good experience in PySpark, including DataFrame core functions and Spark SQL
• Good customer communication skills
• Good analytical skills
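As a quick illustration of the DataFrame-plus-Spark-SQL workflow this role calls for, here is a minimal aggregation of the kind one would submit via `spark.sql(...)`. The standard library's `sqlite3` stands in for the Spark engine so the sketch runs anywhere; the table and column names are hypothetical:

```python
import sqlite3

# Hypothetical sales data; in PySpark this would be a DataFrame,
# here sqlite3 stands in so the SQL logic can run anywhere.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("north", 100.0), ("south", 50.0), ("north", 25.0)],
)

# The same aggregation one would submit via spark.sql(...)
rows = conn.execute(
    "SELECT region, SUM(amount) AS total FROM sales "
    "GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('north', 125.0), ('south', 50.0)]
```

In actual PySpark the only differences are the session setup and the table registration; the SQL itself carries over unchanged.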

Similar jobs

TIFIN FINTECH
Posted by Vrishali Mishra
Mumbai
2 - 5 yrs
Best in industry
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization

Quant Research, TIFIN

Mumbai, India


WHO WE ARE:

TIFIN is a fintech platform backed by industry leaders including JP Morgan, Morningstar, Broadridge, Hamilton Lane, Franklin Templeton, Motive Partners and a who’s who of the financial service industry. We are creating engaging wealth experiences to better financial lives through AI and investment intelligence powered personalization. We are working to change the world of wealth in ways that personalization has changed the world of movies, music and more but with the added responsibility of delivering better wealth outcomes.

We use design and behavioral thinking to enable engaging experiences through software and application programming interfaces (APIs). We use investment science and intelligence to build algorithmic engines inside the software and APIs to enable better investor outcomes.

In a world where every individual is unique, we match them to financial advice and investments with a recognition of their distinct needs and goals across our investment marketplace and our advice and planning divisions.


OUR VALUES: Go with your GUT

●     Grow at the Edge. We are driven by personal growth. We get out of our comfort zone and keep egos aside to find our genius zones. With self-awareness and integrity we strive to be the best we can possibly be. No excuses.

●     Understanding through Listening and Speaking the Truth. We value transparency. We communicate with radical candor, authenticity and precision to create a shared understanding. We challenge, but once a decision is made, commit fully.

●     I Win for Teamwin. We believe in staying within our genius zones to succeed and we take full ownership of our work. We inspire each other with our energy and attitude. We fly in formation to win together.

 

 

WHAT YOU'LL BE DOING:

We are looking for an experienced quantitative professional to develop, implement, test, and maintain the core algorithms and R&D framework for our investment and investment advisory platform. The ideal candidate for this role has successfully implemented and maintained quantitative and statistical modules using modular software design constructs. The candidate needs to be a responsible product owner, a problem solver, and a team player looking to make a significant impact on a fast-growing company. The successful candidate will report directly to the Head of Quant Research & Development.

 

 

RESPONSIBILITIES:

  • The end-to-end research, development, and maintenance of the investment platform, data, and algorithms
  • Take part in building out the R&D back-testing and simulation engines
  • Thoroughly vet investment algorithmic results
  • Contribute to the research data platform design
  • Investigate datasets for use in new or existing algorithms
  • Participate in agile development practices
  • Liaise with stakeholders to gather and understand the functional requirements
  • Take part in code reviews, ensuring quality meets the highest standards
  • Develop software using high-quality standards and best practices, conduct thorough end-to-end unit testing, and provide support during testing and post go-live
  • Support research innovation through creative and aggressive experimentation with cutting-edge hardware, software, processes, procedures, and methods
  • Collaborate with technology teams to ensure appropriate requirements, standards, and integration

 

REQUIREMENTS:

  • Experience in a quant research & development role
  • Proficient in Python, Git, and Jira
  • Knowledge of SQL and database development (PostgreSQL is a plus)
  • Understanding of R and R Markdown is a plus
  • Bachelor's degree in computer science, computational mathematics, or financial engineering
  • Master's degree or advanced training is a strong plus
  • Excellent mathematical foundation and hands-on experience working in the finance industry
  • Proficient in quantitative, statistical, and ML/AI techniques and their implementation using Python modules such as pandas, NumPy, SciPy, scikit-learn, etc.
  • Strong communication (written and oral) and analytical problem-solving skills
  • Strong attention to detail, pride in delivering high-quality work, and willingness to learn
  • An understanding of or exposure to financial capital markets, various financial instruments (such as stocks, ETFs, mutual funds, etc.), and financial tools (such as Bloomberg, Reuters, etc.)


 

BENEFITS PACKAGE:

TIFIN offers a competitive benefits package that includes:

· Performance-linked variable compensation

· Medical insurance

· Tax-saving benefits

· Flexible PTO policy and company-paid holidays

· Parental leave: 6 months' paid maternity leave, 2 weeks' paid paternity leave

· Access to our wellness trainers, including 1:1 personal coaching for executives and rising stars

 

A note on location: while we have team centres in Boulder, New York City, San Francisco, Charlotte, and Bangalore, this role is based out of Mumbai.

 

TIFIN is proud to be an equal opportunity workplace and values the multitude of talents and perspectives that a diverse workforce brings. All qualified applicants will receive consideration for employment without regard to race, national origin, religion, age, color, sex, sexual orientation, gender identity, disability, or protected veteran status.

 

 

 

 

 

Antuit
Posted by Purnendu Shakunt
Bengaluru (Bangalore)
8 - 12 yrs
₹25L - ₹30L / yr
Data Science
Machine Learning (ML)
Artificial Intelligence (AI)
Data Scientist
Python

About antuit.ai

 

Antuit.ai is the leader in AI-powered SaaS solutions for Demand Forecasting & Planning, Merchandising and Pricing. We have the industry’s first solution portfolio – powered by Artificial Intelligence and Machine Learning – that can help you digitally transform your Forecasting, Assortment, Pricing, and Personalization solutions. World-class retailers and consumer goods manufacturers leverage antuit.ai solutions, at scale, to drive outsized business results globally with higher sales, margin and sell-through.

 

Antuit.ai’s executives, comprised of industry leaders from McKinsey, Accenture, IBM, and SAS, and our team of Ph.Ds., data scientists, technologists, and domain experts, are passionate about delivering real value to our clients. Antuit.ai is funded by Goldman Sachs and Zodius Capital.

 

The Role:

 

Antuit.ai is hiring a Principal Data Scientist. This person will facilitate standing up the standardization and automation ecosystem for ML product delivery, and will also actively participate in managing the implementation, design, and tuning of the product to meet business needs.

 

Responsibilities:

 

Responsibilities include, but are not limited to, the following:

 

  • Manage and provide technical expertise to the delivery team. This includes recommending solution alternatives, identifying risks, and managing business expectations.
  • Design and build reliable, scalable automated processes for large-scale machine learning.
  • Use engineering expertise to help design solutions to novel problems in software development, data engineering, and machine learning.
  • Collaborate with Business, Technology, and Product teams to stand up the MLOps process.
  • Apply your experience in making intelligent, forward-thinking technical decisions to deliver the ML ecosystem, including implementing new standards, architecture design, and workflow tools.
  • Deep dive into complex algorithmic and product issues in production
  • Own metrics and reporting for the delivery team.
  • Set a clear vision for the team members and work cohesively to attain it.
  • Mentor and coach team members
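The "reliable and scalable automated processes" responsibility above is, at its core, about composable pipeline stages. A minimal, dependency-free sketch of that pattern (stage names and data are hypothetical; a production system would use something like MLflow or Kubeflow Pipelines instead):

```python
# Minimal pipeline-of-stages pattern: each stage is a callable that
# transforms the data and passes it on, so runs are repeatable and
# individual steps are easy to test and swap out.

def drop_nulls(rows):
    return [r for r in rows if all(v is not None for v in r.values())]

def scale_amount(rows):
    mx = max(r["amount"] for r in rows)
    return [{**r, "amount": r["amount"] / mx} for r in rows]

def run_pipeline(rows, stages):
    for stage in stages:
        rows = stage(rows)
    return rows

raw = [
    {"sku": "a", "amount": 50.0},
    {"sku": "b", "amount": None},
    {"sku": "c", "amount": 100.0},
]
clean = run_pipeline(raw, [drop_nulls, scale_amount])
print(clean)  # [{'sku': 'a', 'amount': 0.5}, {'sku': 'c', 'amount': 1.0}]
```

Keeping each stage a pure function is what makes the process automatable: the same stage list can be re-run on new data with identical behaviour.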


Qualifications and Skills:

 

Requirements

  • Engineering degree in any stream
  • At least 7 years of prior experience in building ML-driven products/solutions
  • Excellent programming skills in at least one of C++, Python, or Java
  • Hands-on experience with open-source libraries and frameworks: TensorFlow, PyTorch, MLflow, Kubeflow, etc.
  • Developed and productized large-scale models/algorithms in prior experience
  • Can drive fast prototypes/proofs of concept to evaluate various technologies, frameworks, and performance benchmarks
  • Familiar with software development practices/pipelines (DevOps: Kubernetes, Docker containers, CI/CD tools)
  • Good verbal, written, and presentation skills
  • Ability to learn new skills and technologies
  • 3+ years working with retail or CPG preferred
  • Experience in forecasting and optimization problems, particularly in the CPG/Retail industry, preferred

 

Information Security Responsibilities

 

  • Understand and adhere to Information Security policies, guidelines, and procedures, and practice them to protect organizational data and information systems.
  • Take part in Information Security training and act accordingly when handling information.
  • Report all suspected security and policy breaches to the Infosec team or the appropriate authority (CISO).

EEOC

 

Antuit.ai is an at-will, equal opportunity employer.  We consider applicants for all positions without regard to race, color, religion, national origin or ancestry, gender identity, sex, age (40+), marital status, disability, veteran status, or any other legally protected status under local, state, or federal law.
Consulting and Services company
Hyderabad, Ahmedabad
5 - 10 yrs
₹5L - ₹30L / yr
Amazon Web Services (AWS)
Apache
Python
PySpark

Data Engineer 

  

Mandatory Requirements  

  • Experience in AWS Glue
  • Experience in Apache Parquet
  • Proficient in AWS S3 and data lakes
  • Knowledge of Snowflake
  • Understanding of file-based ingestion best practices
  • Scripting languages: Python & PySpark

 

CORE RESPONSIBILITIES 

  • Create and manage cloud resources in AWS
  • Ingest data from different data sources that expose data using different technologies, such as RDBMS, REST HTTP APIs, flat files, streams, and time-series data from various proprietary systems. Implement data ingestion and processing with the help of Big Data technologies
  • Process and transform data using various technologies such as Spark and cloud services. You will need to understand your part of the business logic and implement it using the language supported by the base data platform
  • Develop automated data quality checks to make sure the right data enters the platform, and verify the results of the calculations
  • Develop an infrastructure to collect, transform, combine, and publish/distribute customer data
  • Define process improvement opportunities to optimize data collection, insights, and displays
  • Ensure data and results are accessible, scalable, efficient, accurate, complete, and flexible
  • Identify and interpret trends and patterns from complex data sets
  • Construct a framework utilizing data visualization tools and techniques to present consolidated analytical and actionable results to relevant stakeholders
  • Be a key participant in regular Scrum ceremonies with the agile teams
  • Proficiently develop queries, write reports, and present findings
  • Mentor junior members and bring in best industry practices
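The "automated data quality checks" responsibility above usually reduces to running a rule set over incoming rows and quarantining the ones that fail before they enter the platform. A dependency-free sketch (field names and rules are hypothetical):

```python
# A tiny, dependency-free sketch of an automated data quality check:
# each rule returns an error string (or None), and rows with errors
# are quarantined before they enter the platform.

RULES = [
    lambda r: "missing id" if not r.get("id") else None,
    lambda r: "negative amount" if r.get("amount", 0) < 0 else None,
    lambda r: "bad currency" if r.get("currency") not in {"INR", "USD"} else None,
]

def check(rows):
    good, quarantined = [], []
    for row in rows:
        errors = [e for rule in RULES if (e := rule(row))]
        (quarantined if errors else good).append((row, errors))
    return [r for r, _ in good], quarantined

rows = [
    {"id": "t1", "amount": 10.0, "currency": "INR"},
    {"id": "",   "amount": -5.0, "currency": "EUR"},
]
good, bad = check(rows)
print(len(good), bad[0][1])  # 1 ['missing id', 'negative amount', 'bad currency']
```

In an AWS Glue/PySpark setting the same rules would be expressed as DataFrame filters, but the quarantine pattern is identical.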

 

QUALIFICATIONS 

  • 5-7+ years' experience as a data engineer in consumer finance or an equivalent industry (consumer loans, collections, servicing, optional products, and insurance sales)
  • Strong background in math, statistics, computer science, data science, or a related discipline
  • Advanced knowledge of one of: Java, Scala, Python, C#
  • Production experience with: HDFS, YARN, Hive, Spark, Kafka, Oozie/Airflow, Amazon Web Services (AWS), Docker/Kubernetes, Snowflake
  • Proficient with:
    • Data mining/programming tools (e.g., SAS, SQL, R, Python)
    • Database technologies (e.g., PostgreSQL, Redshift, Snowflake, and Greenplum)
    • Data visualization (e.g., Tableau, Looker, MicroStrategy)
  • Comfortable learning about and deploying new technologies and tools
  • Organizational skills and the ability to handle multiple projects and priorities simultaneously and meet established deadlines
  • Good written and oral communication skills and the ability to present results to non-technical audiences
  • Knowledge of business intelligence and analytical tools, technologies, and techniques

 

Familiarity and experience in the following is a plus:  

  • AWS certification 
  • Spark Streaming  
  • Kafka Streaming / Kafka Connect  
  • ELK Stack  
  • Cassandra / MongoDB  
  • CI/CD: Jenkins, GitLab, Jira, Confluence, and other related tools
Promilo
Posted by Karina Biswal
Bengaluru (Bangalore)
2 - 8 yrs
₹1.5L - ₹3L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization

Designation: Business Analyst

Company name: promilo.com (Sawara Solutions Pvt Ltd)

Experience: 2 - 8 yrs

Location: Bangalore

Mode: full time / work from office


About us:

Promilo is India's first "pay to browse" platform.

It is a B2B SaaS start-up that accelerates companies' business appointment funnels. We're a SaaS-based advertising platform that connects users and advertisers. Users can book an online appointment with an advertiser based on their interests, without compromising their data privacy, and get rewarded for sharing their data and time. We're registered and recognized by Startup India, Startup Karnataka & MSME. We are also among the top 100 Google AppScale Academy start-ups.


Job description:


We are looking for an experienced business analyst to join our team. The ideal candidate will have 2-8 years of experience analyzing web & mobile user and client data for start-ups, a strong passion for helping start-ups, and a proven track record of bringing strong business insights to improve the organisation's sales, marketing, user and client experience, UI, and UX.


Responsibilities

  • Requirement gathering and analysis
  • Conduct gap analysis, assess scope & suggest solutions
  • Responsible for technical proposal writing and time and cost analysis for web and mobile application development
  • Preparing RFPs/RFQs
  • Involvement in presales activities
  • Work as a liaison between the client and the technical team
  • Create wireframes | prototypes | feature lists | SRS | BRD & flow diagrams as per the client's requirements
  • High IT literacy: proven use of the web and associated technologies (Excel, PowerPoint, Google Apps)
  • Previous experience with data visualization tools (Tableau, Power BI, etc.) is strongly preferred
  • Cleanse and curate sourced data into standardized reporting templates
  • Create, document, validate, and ensure delivery of ad hoc, daily, weekly, and monthly reports to internal stakeholders
  • Create, validate, and deliver tracking links for the marketing department
  • Assist in the creation, QA, validation, and reporting of A/B and multivariate tests
  • Proactively monitor the marketing KPIs and UA data on a daily basis
  • Analyze marketing UA performance and conduct deep-dive analysis to answer hypotheses and questions posed by the team
  • Gather, transform, and analyze digital marketing data, including paid media, search, social, website, and conversion funnel analytics
  • Analyze marketing data, searching for top-of-funnel growth opportunities
  • Analyze product data, searching for insights to increase app engagement, conversion, and retention
  • Analyze LTV/CAC drivers to support overall business growth
  • Partner with marketing, product, and growth teams
  • Present findings to stakeholders and make recommendations for spend targets and campaign strategies
  • POV on iOS 14 and upcoming Android privacy changes, and how we can navigate tracking in light of these changes
  • POV on the transition to SKAN 4.0
  • Working knowledge of statistical techniques (regression, k-means clustering, PCA)
  • Experience with lift studies and marketing mix modeling; working experience with Python, R, & dbt
  • Experience at a small company
  • Experience with a subscription business
  • Analyze website and mobile app data on traffic sources and patterns. Provide insight on data trends and anomalies, making recommendations where appropriate to improve business performance.
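The LTV/CAC analysis mentioned above reduces to simple unit-economics arithmetic: LTV is lifetime gross margin per customer, CAC is blended acquisition spend per acquired customer, and the ratio flags whether acquisition pays back. A sketch with entirely hypothetical figures:

```python
# Hypothetical unit-economics sketch: LTV from ARPU, gross margin, and
# monthly churn (expected lifetime ≈ 1 / churn), CAC from blended spend.

def ltv(arpu_monthly, gross_margin, monthly_churn):
    lifetime_months = 1 / monthly_churn          # geometric-lifetime approximation
    return arpu_monthly * gross_margin * lifetime_months

def cac(total_acquisition_spend, customers_acquired):
    return total_acquisition_spend / customers_acquired

ltv_value = ltv(arpu_monthly=200.0, gross_margin=0.8, monthly_churn=0.05)
cac_value = cac(total_acquisition_spend=96_000.0, customers_acquired=120)
print(round(ltv_value), round(cac_value), round(ltv_value / cac_value, 2))  # 3200 800 4.0
```

A ratio around 3 or above is a common rule of thumb for healthy acquisition; the real analysis would segment by channel and cohort rather than use blended averages.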


Qualification

  • Master's or bachelor's degree in computer science
  • Well-versed with IT technologies
  • 2+ years of business analysis or project analysis experience
  • Tech-savvy with proficiency in Microsoft Office, Google Apps, and other web and mobile applications
  • Excellent written and verbal communication skills
  • Self-motivated, flexible, and comfortable with a fast-paced startup environment
  • Advanced experience with Excel and Google Sheets, including an understanding of visualizations, pivot tables, VLOOKUP, and other key functions
  • Experience with Adobe Analytics, Google Analytics, Tableau, SQL, and data grids is a plus
  • Strong analytical and problem-solving skills, with clear attention to detail
  • Ability to prioritize and work under tight deadlines
  • Fast learner, able to master new concepts, theories, ideas, and processes with ease
  • Experience creating user acquisition reports and dashboards
  • Deep understanding of mobile attribution, cohort analysis, customer segmentation, and LTV modeling
  • Experience pulling data and creating databases using the API of at least one of these ad platforms: Facebook, Snapchat, TikTok, Google Ads, AppLovin
  • Experience with the architecture and deployment of mobile tracking solutions, including SDK integration, ad platform APIs, server postbacks, and MMP providers such as AppsFlyer, Adjust, Kochava


If you are a data-driven individual with a passion for start-ups and experience in business analytics, we encourage you to apply for this position. We offer a competitive salary package, flexible working hours, and a supportive work environment that fosters growth and development.


RandomTrees
Posted by Amareswarreddt yaddula
Remote only
5 - 10 yrs
₹1L - ₹30L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark

Job Title: Senior Data Engineer

Experience: 8 to 11 yrs

Location: Remote

Notice: Immediate or max 1 month

Role: Permanent


Skill set: Google Cloud Platform, BigQuery, Java, Python, Airflow, Dataflow, Apache Beam.


Experience required:

5 years of experience in software design and development, with 4 years of experience in the data engineering field, is preferred.

2 years of hands-on experience in GCP cloud data implementation suites such as BigQuery, Pub/Sub, Dataflow/Apache Beam, Airflow/Composer, Cloud Storage, etc.

Strong experience and understanding of very large-scale data architecture, solutions, and operationalization of data warehouses, data lakes, and analytics platforms.

Mandatory 1 year of software development skills using Java or Python.

Extensive hands-on experience working with data using SQL and Python.


Must Have: GCP, BigQuery, Airflow, Dataflow, Python, Java.


GCP knowledge is a must.

Java as the programming language (preferred).

BigQuery, Pub/Sub, Dataflow/Apache Beam, Airflow/Composer, Cloud Storage, Python.

Good communication skills.


Ganit Business Solutions
Posted by Viswanath Subramanian
Remote, Chennai, Bengaluru (Bangalore), Mumbai
3 - 7 yrs
₹12L - ₹25L / yr
Machine Learning (ML)
Data Science
Natural Language Processing (NLP)
Computer Vision
R Programming

Ganit has flipped the data science value chain: we do not start with a technique; for us, consumption comes first. With this philosophy, we have successfully scaled from a small start-up to a 200-person company with clients in the US, Singapore, Africa, the UAE, and India.

We are looking for experienced data enthusiasts who can make the data talk to them. 

 

You will: 

  • Understand business problems and translate business requirements into technical requirements.
  • Conduct complex data analysis to ensure data quality & reliability, i.e., make the data talk by extracting, preparing, and transforming it.
  • Identify, develop, and implement statistical techniques and algorithms to address business challenges and add value to the organization.
  • Gather requirements and communicate findings in the form of a meaningful story with the stakeholders.
  • Build & implement data models using predictive modelling techniques. Interact with clients and provide support for queries and delivery adoption.
  • Lead and mentor data analysts.

 

We are looking for someone who has: 

 

Apart from your love for data and your ability to code even while sleeping, you would need the following:

  • A minimum of 2 years of experience in designing and delivering data science solutions
  • Successful retail/BFSI/FMCG/Manufacturing/QSR projects in your kitty to show off
  • A deep understanding of various statistical techniques, mathematical models, and algorithms to start the conversation with the data in hand
  • The ability to choose the right model for the data and translate it into code using R, Python, VBA, SQL, etc.
  • A Bachelor's/Master's degree in Engineering/Technology, an MBA from a Tier-1 B-school, or an M.Sc. in Statistics or Mathematics

Skillset Required:

  • Regression
  • Classification
  • Predictive Modelling
  • Prescriptive Modelling
  • Python
  • R
  • Descriptive Modelling
  • Time Series
  • Clustering
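Of the skills listed above, clustering is the easiest to demonstrate concretely. A minimal 1-D k-means sketch in plain Python (toy data and a fixed initialization keep the run deterministic; real work would use scikit-learn's `KMeans`):

```python
# Toy 1-D k-means: alternate assigning points to the nearest centroid
# and recomputing each centroid as its cluster's mean.

def kmeans_1d(points, centroids, iters=10):
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for p in points:
            idx = min(range(len(centroids)), key=lambda i: abs(p - centroids[i]))
            clusters[idx].append(p)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids

data = [1.0, 2.0, 0.0, 9.0, 9.5, 8.5]
print(kmeans_1d(data, centroids=[0.0, 10.0]))  # [1.0, 9.0]
```

The two centroids settle on the means of the two obvious groups; in higher dimensions only the distance function changes.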

What is in it for you: 

 

  • Be a part of building the biggest brand in Data science. 
  • An opportunity to be a part of a young and energetic team with a strong pedigree. 
  • Work on awesome projects across industries and learn from the best in the industry, while growing at a hyper rate. 

 

Please Note:  

 

At Ganit, we are looking for people who love problem solving. You are encouraged to apply even if your experience does not precisely match the job description above. Your passion and skills will stand out and set you apart—especially if your career has taken some extraordinary twists and turns over the years. We welcome diverse perspectives, people who think rigorously and are not afraid to challenge assumptions in a problem. Join us and punch above your weight! 

Ganit is an equal opportunity employer and is committed to providing a work environment that is free from harassment and discrimination. 

All recruitment, selection procedures and decisions will reflect Ganit’s commitment to providing equal opportunity. All potential candidates will be assessed according to their skills, knowledge, qualifications, and capabilities. No regard will be given to factors such as age, gender, marital status, race, religion, physical impairment, or political opinions. 

Hyderabad
12 - 20 yrs
₹15L - ₹50L / yr
Analytics
Data Analytics
Kubernetes
PySpark
Python

Job Description

We are looking for an experienced engineer with superb technical skills. You will primarily be responsible for architecting and building large-scale data pipelines that deliver AI and analytical solutions to our customers. The right candidate will enthusiastically take ownership in developing and managing continuously improving, robust, scalable software solutions.

Although your primary responsibilities will be around back-end work, we prize individuals who are willing to step in and contribute to other areas, including automation, tooling, and management applications. Experience with, or a desire to learn, Machine Learning is a plus.

 

Skills

  • Bachelor's/Master's/PhD in CS or equivalent industry experience
  • Demonstrated expertise in building and shipping cloud-native applications
  • 5+ years of industry experience in administering (including setting up, managing, and monitoring) data processing pipelines (both streaming and batch) using frameworks such as Kafka Streams and PySpark, and streaming databases like Druid or equivalents like Hive
  • Strong industry expertise with containerization technologies, including Kubernetes (EKS/AKS) and Kubeflow
  • Experience with cloud platform services such as AWS, Azure, or GCP, especially with EKS and Managed Kafka
  • 5+ years of industry experience in Python
  • Experience with popular modern web frameworks such as Spring Boot, Play Framework, or Django
  • Experience with scripting languages; Python experience highly desirable. Experience in API development using Swagger
  • Implementing automated testing platforms and unit tests
  • Proficient understanding of code versioning tools, such as Git
  • Familiarity with continuous integration (Jenkins)

Responsibilities

  • Architect, design, and implement large-scale data processing pipelines using Kafka Streams, PySpark, Fluentd, and Druid
  • Create custom operators for Kubernetes and Kubeflow
  • Develop data ingestion processes and ETLs
  • Assist in DevOps operations
  • Design and implement APIs
  • Identify performance bottlenecks and bugs, and devise solutions to these problems
  • Help maintain code quality, organization, and documentation
  • Communicate with stakeholders regarding various aspects of the solution
  • Mentor team members on best practices
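The "data ingestion processes and ETLs" responsibility above can be illustrated with a minimal extract-transform-load loop. This stdlib-only sketch (hypothetical record fields) stands in for what Kafka Streams or PySpark would do at scale:

```python
import csv
import io

# Extract: parse CSV records (stands in for a Kafka topic or file drop).
RAW = "ts,level,msg\n1,ERROR,disk full\n2,INFO,ok\n3,ERROR,timeout\n"

def extract(text):
    return list(csv.DictReader(io.StringIO(text)))

# Transform: keep only errors, normalize the timestamp type.
def transform(records):
    return [{"ts": int(r["ts"]), "msg": r["msg"]}
            for r in records if r["level"] == "ERROR"]

# Load: here just an in-memory sink; in production, a warehouse table.
def load(records, sink):
    sink.extend(records)
    return len(records)

sink = []
n = load(transform(extract(RAW)), sink)
print(n, sink[0])  # 2 {'ts': 1, 'msg': 'disk full'}
```

Separating the three phases is what makes the pipeline testable: each function can be unit-tested on its own, which matches the "automated testing platforms and unit tests" requirement above.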
Marktine
Posted by Vishal Sharma
Remote, Bengaluru (Bangalore)
3 - 6 yrs
₹10L - ₹25L / yr
Cloud
Google Cloud Platform (GCP)
BigQuery
Python
SQL

Specific Responsibilities

  • Minimum of 2 years of experience with Google BigQuery and Google Cloud Platform
  • Design and develop the ETL framework using BigQuery
  • Expertise in BigQuery concepts like nested queries, clustering, partitioning, etc.
  • Working experience with clickstream databases and Google Analytics/Adobe Analytics
  • Should be able to automate the data load from BigQuery using APIs or a scripting language
  • Good experience with advanced SQL concepts
  • Good experience with Adobe Launch web, mobile & e-commerce tag implementation
  • Identify complex fuzzy problems, break them down into smaller parts, and implement creative, data-driven solutions
  • Responsible for defining, analyzing, and communicating key metrics and business trends to the management teams
  • Identify opportunities to improve conversion & user experience through data. Influence product & feature roadmaps.
  • Must have a passion for data quality and be constantly looking to improve the system. Drive data-driven decision-making through the stakeholders & drive change management
  • Understand requirements to translate business problems & technical problems into analytics problems
  • Effective storyboarding and presentation of the solution to the client and leadership
  • Client engagement & management
  • Ability to interface effectively with multiple levels of management and functional disciplines
  • Assist in developing/coaching individuals technically as well as on soft skills during the project and as part of the client project's training program
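The nested-query expertise mentioned above refers to BigQuery's nested/repeated fields, which are usually flattened with `UNNEST` in SQL. The same flattening can be sketched in plain Python (the record shape below is hypothetical):

```python
# Flatten a nested/repeated record the way BigQuery's UNNEST would:
# one output row per element of the repeated "events" field, with the
# top-level fields copied onto each row.

def unnest(record, repeated_field):
    base = {k: v for k, v in record.items() if k != repeated_field}
    return [{**base, **child} for child in record[repeated_field]]

record = {
    "user": "u1",
    "events": [
        {"type": "click", "ts": 1},
        {"type": "view", "ts": 2},
    ],
}
rows = unnest(record, "events")
print(rows[0])  # {'user': 'u1', 'type': 'click', 'ts': 1}
```

In BigQuery SQL the equivalent would be a `SELECT ... FROM table, UNNEST(events)` join; clustering and partitioning then apply to the flattened output table.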

 

Work Experience
  • 2 to 3 years of working experience with Google BigQuery & Google Cloud Platform
  • Relevant experience in Consumer Tech/CPG/Retail industries
  • Bachelor's in Engineering, Computer Science, Math, Statistics, or a related discipline
  • Strong problem-solving and web analytics skills. Acute attention to detail.
  • Experience in analyzing large, complex, multi-dimensional data sets
  • Experience in one or more roles in an online eCommerce or online support environment
 
Skills
  • Expertise in Google BigQuery & Google Cloud Platform
  • Experience in advanced SQL and a scripting language (Python/R)
  • Hands-on experience with BI tools (Tableau, Power BI)
  • Working experience with and understanding of Adobe Analytics or Google Analytics
  • Experience in creating and debugging website & app tracking (Omnibus, Dataslayer, GA debugger, etc.)
  • Excellent analytical thinking, analysis, and problem-solving skills
  • Knowledge of other GCP services is a plus
 
Saviance Technologies
Posted by Shipra Agrawal
NCR (Delhi | Gurgaon | Noida)
3 - 5 yrs
₹7L - ₹9L / yr
Power BI
Business Intelligence (BI)
DAX
Data modeling

 

Job Title: Power BI Developer (Onsite)

Location: Park Centra, Sec 30, Gurgaon

CTC:        8 LPA

Time:       1:00 PM - 10:00 PM

  

Must Have Skills: 

  • Power BI Desktop Software
  • DAX queries
  • Data modeling
  • Row-level security
  • Visualizations
  • Data Transformations and filtering
  • SSAS and SQL
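Of the must-have skills above, row-level security is the easiest to illustrate outside Power BI itself: each user only sees the rows their role permits. A language-agnostic sketch in Python (roles and data are hypothetical; in Power BI this is expressed as a DAX filter rule attached to a role):

```python
# Row-level security sketch: a per-role predicate filters the dataset
# before any visual sees it. In Power BI the predicate would be a DAX
# rule on a role, often keyed off USERPRINCIPALNAME().

SALES = [
    {"region": "north", "amount": 120},
    {"region": "south", "amount": 80},
    {"region": "north", "amount": 40},
]

ROLE_FILTERS = {
    "north_manager": lambda row: row["region"] == "north",
    "admin": lambda row: True,
}

def rows_for(role):
    allowed = ROLE_FILTERS[role]
    return [r for r in SALES if allowed(r)]

print(sum(r["amount"] for r in rows_for("north_manager")))  # 160
```

The key property, in both the sketch and Power BI, is that the filter is applied centrally at the data layer, so every report and visual inherits it automatically.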

 

Job description:

 

We are looking for a PBI Analytics Lead responsible for efficient data visualization, DAX queries, and data modeling. The candidate will work on creating complex Power BI reports, and will be involved in creating complex M and DAX queries and working on data modeling, row-level security, visualizations, data transformations, and filtering. The candidate will work closely with the client team to provide solutions and suggestions on Power BI.

 

Roles and Responsibilities:

 

  • Accurate, intuitive, and aesthetic Visual Display of Quantitative Information: We generate data, information, and insights through our business, product, brand, research, and talent teams. You would assist in transforming this data into visualizations that represent easy-to-consume visual summaries, dashboards and storyboards. Every graph tells a story.
  • Understanding Data: You would be performing and documenting data analysis, data validation, and data mapping/design. You would be mining large datasets to determine their characteristics and select appropriate visualizations.
  • Project Owner: You would develop, maintain, and manage advanced reporting, analytics, dashboards and other BI solutions, and would be continuously reviewing and improving existing systems and collaborating with teams to integrate new systems. You would also contribute to the overall data analytics strategy by knowledge sharing and mentoring end users.
  • Perform ongoing maintenance & production of Management dashboards, data flows, and automated reporting.
  • Manage upstream and downstream impact of all changes on automated reporting/dashboards
  • Independently apply problem-solving ability to identify meaningful insights to business
  • Identify automation opportunities and work with a wide range of stakeholders to implement the same.
  • The ability and self-confidence to work independently and increase the scope of the service line

 

Requirements: 

  • 3+ years of work experience as an Analytics Lead / Senior Analyst / Sr. PBI Developer.
  • Sound understanding and knowledge of PBI Visualization and Data Modeling with DAX queries
  • Experience in leading and mentoring a small team.

 

 

 

Sameeksha Capital
Ahmedabad
1 - 2 yrs
₹1L - ₹3L / yr
Java
Python
Data Structures
Algorithms
C++
Looking for an Alternative Data Programmer for an equity fund.
The programmer should be proficient in Python and able to work fully independently. They should also have the skills to work with databases and a strong ability to understand how to fetch data from various sources, organize the data, and identify useful information through efficient code.
Familiarity with Python
Some examples of work:
Text search on earnings transcripts for keywords to identify future trends.
Integration of internal and external financial databases.
Web scraping to capture, clean, and organize data.
Automatic updating of our financial models by importing data from machine-readable formats such as XBRL.
Fetching data from public databases such as RBI, NSE, BSE, and DGCA, and processing the same.
Back-testing of data, either to test historical cause-and-effect relations on market/portfolio performance, or to back-test our screener criteria in devising strategy.
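The first example of work above (keyword search on earnings transcripts) can be sketched in a few lines of stdlib Python; the transcript snippets and keyword list below are hypothetical:

```python
import re

# Count keyword hits per transcript to surface forward-looking themes.
TRANSCRIPTS = {
    "Q1": "We expect capacity expansion next year; demand remains strong.",
    "Q2": "Margin pressure continues, but capacity expansion is on track.",
}
KEYWORDS = ["capacity expansion", "margin pressure"]

def keyword_hits(text, keywords):
    return {kw: len(re.findall(re.escape(kw), text, flags=re.IGNORECASE))
            for kw in keywords}

hits = {q: keyword_hits(t, KEYWORDS) for q, t in TRANSCRIPTS.items()}
print(hits["Q2"])  # {'capacity expansion': 1, 'margin pressure': 1}
```

Tracking how hit counts trend across quarters is one simple way to turn transcript text into a time series a model can use.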