Data Engineer
3 - 5 yrs
Best in industry
Gurugram
Skills
Data Warehouse (DWH)
Informatica
ETL
Python
SQL
Power BI
Tableau
Amazon Web Services (AWS)
Snowflake schema
Snowflake

The Client is the world’s largest media investment company. Our team of experts supports clients in programmatic, social, paid search, analytics, technology, organic search, affiliate marketing, e-commerce, and traditional channels. We are currently looking for a Manager Analyst – Analytics to join us. In this role, you will work on various projects for the in-house team across data management, reporting, and analytics.


Responsibilities:

 

•       Serve as a Subject Matter Expert on data usage – extraction, manipulation, and inputs for analytics

•       Develop data extraction and manipulation code based on business rules

•       Design and construct data stores and procedures for their maintenance

•       Develop and maintain strong relationships with stakeholders

•       Write high-quality code as per prescribed standards

•       Participate in internal projects as required


Requirements:

 

•       2-5 years of strong experience working with SQL, Python, and ETL development (a brief illustrative sketch follows this list)

•       Strong experience in writing complex SQL queries

•       Good communication skills

•       Good experience working with a BI tool such as Tableau or Power BI

•       Familiarity with various cloud technologies and their offerings in the data and data warehousing space

•       Snowflake and AWS experience is good to have
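
To give a flavour of the SQL-plus-Python ETL workflow described above, here is a minimal, hypothetical sketch. The connection URL, table, and column names are invented for illustration, and it assumes pandas, SQLAlchemy, and a Snowflake SQLAlchemy dialect are installed; it is not the client's actual pipeline.

```python
# Illustrative only: connection string, table, and column names are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

# Connect to a warehouse (any SQLAlchemy-compatible URL would work here;
# the snowflake:// dialect requires the snowflake-sqlalchemy package).
engine = create_engine("snowflake://user:password@account/db/schema")

# Extract: pull raw campaign spend with a plain SQL query.
raw = pd.read_sql(
    "SELECT campaign_id, channel, spend, impressions, report_date FROM campaign_spend",
    engine,
)

# Transform: apply a simple business rule (cost per thousand impressions by channel).
raw["cpm"] = raw["spend"] / raw["impressions"] * 1000
summary = (
    raw.groupby(["report_date", "channel"], as_index=False)
       .agg(total_spend=("spend", "sum"), avg_cpm=("cpm", "mean"))
)

# Load: write the derived table back for BI tools (Tableau / Power BI) to consume.
summary.to_sql("channel_spend_summary", engine, if_exists="replace", index=False)
```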

 

Minimum qualifications:

•       B. Tech./MCA or equivalent preferred

•       At least 2 years of hands-on experience in big data, ETL development, and data processing



Similar jobs

Fatakpay
Posted by Ajit Kumar
Mumbai
2 - 5 yrs
₹8L - ₹15L / yr
Python
Risk analysis
credit card

Job Title: Credit Risk Analyst

Company: FatakPay FinTech

Location: Mumbai, India

Salary Range: INR 8 - 15 Lakhs per annum

Job Description:

FatakPay, a leading player in the fintech sector, is seeking a dynamic and skilled Credit Risk Analyst to join our team in Mumbai. This position is tailored for professionals who are passionate about leveraging technology to enhance financial services. If you have a strong background in engineering and a keen eye for risk management, we invite you to be a part of our innovative journey.

Key Responsibilities:

  • Conduct thorough risk assessments by analyzing borrowers' financial data, including financial statements, credit scores, and income details.
  • Develop and refine predictive models using advanced statistical methods to forecast loan defaults and assess creditworthiness (a toy modeling sketch follows this list).
  • Collaborate in the formulation and execution of credit policies and risk management strategies, ensuring compliance with regulatory standards.
  • Monitor and analyze the performance of loan portfolios, identifying trends, risks, and opportunities for improvement.
  • Stay updated with financial regulations and standards, ensuring all risk assessment processes are in compliance.
  • Prepare comprehensive reports on credit risk analyses and present findings to senior management.
  • Work closely with underwriting, finance, and sales teams to provide critical input influencing lending decisions.
  • Analyze market trends and economic conditions, adjusting risk assessment models and strategies accordingly.
  • Utilize cutting-edge financial technologies for more efficient and accurate data analysis.
  • Engage in continual learning to stay abreast of new tools, techniques, and best practices in credit risk management.
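
As a toy illustration of the predictive-modeling responsibility above (not FatakPay's actual models), the sketch below fits a baseline logistic regression on fabricated borrower features using scikit-learn; all feature names and data are made up.

```python
# Toy sketch: the feature names and data are fabricated for illustration.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 1_000
borrowers = pd.DataFrame({
    "credit_score": rng.normal(650, 50, n),
    "monthly_income": rng.normal(60_000, 15_000, n),
    "debt_to_income": rng.uniform(0.05, 0.8, n),
})
# Synthetic default flag driven mostly by debt-to-income ratio.
defaulted = (borrowers["debt_to_income"] + rng.normal(0, 0.1, n) > 0.6).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    borrowers, defaulted, test_size=0.25, random_state=0
)

# A simple, interpretable baseline before moving to richer models.
model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```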

Qualifications:

  • Minimum qualification: B.Tech or Engineering degree from a reputed institution.
  • 2-4 years of experience in credit risk analysis, preferably in a fintech environment.
  • Proficiency in data analysis, statistical modeling, and machine learning techniques.
  • Strong analytical and problem-solving skills.
  • Excellent communication skills, with the ability to present complex data insights clearly.
  • A proactive approach to work in a fast-paced, technology-driven environment.
  • Up-to-date knowledge of financial regulations and compliance standards.


We look forward to discovering how your expertise and innovative ideas can contribute to the growth and success of FatakPay. Join us in redefining the future of fintech!





Databook
Posted by Nikhil Mohite
Mumbai
1 - 3 yrs
Up to ₹20L / yr (varies)
Data engineering
Python
Apache Kafka
Spark
Amazon Web Services (AWS)
+1 more


About Databook:

- Great salespeople let their customers’ strategies do the talking.

 

Databook’s award-winning Strategic Relationship Management (SRM) platform uses advanced AI and NLP to empower the world’s largest B2B sales teams to create, manage, and maintain strategic relationships at scale. The platform ingests and interprets billions of financial and market data signals to generate actionable sales strategies that connect the seller’s solutions to a buyer’s financial pain and urgency.

 

The Opportunity

We're seeking Junior Engineers to support and develop Databook’s capabilities. Working closely with our seasoned engineers, you'll contribute to crafting new features and ensuring our platform's reliability. If you're eager to play a part in building the future of customer intelligence, with a keen eye towards quality, we'd love to meet you!

 

Specifically, you'll

- Participate in various stages of the engineering lifecycle alongside our experienced engineers.

- Assist in maintaining and enhancing features of the Databook platform.

- Collaborate with various teams to comprehend requirements and aid in implementing technology solutions.

 

Please note: As you progress and grow with us, you might be introduced to on-call rotations to handle any platform challenges.

 

Working Arrangements:

- This position offers a hybrid work mode, allowing employees to work both remotely and in-office as mutually agreed upon.

 

What we're looking for

- 1-2+ years of experience as a Data Engineer

- Bachelor's degree in Engineering

- Willingness to work across different time zones

- Ability to work independently

- Knowledge of cloud (AWS or Azure)

- Exposure to distributed systems such as Spark, Flink or Kafka

- Fundamental knowledge of data modeling and optimizations

- At least one year of experience using Python as a Software Engineer

- Knowledge of SQL (Postgres) databases would be beneficial

- Experience building analytics dashboards (a small aggregation sketch follows this list)

- Familiarity with RESTful APIs and/or GraphQL is welcomed

- Hands-on experience with NumPy, pandas, and spaCy would be a plus

- Exposure to or working experience with GenAI (LLMs in general) and LLMOps would be a plus

- Highly fluent in both spoken and written English
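
As a small, hypothetical example of the pandas skills and dashboard-oriented aggregation mentioned above (the column names and data are invented, not Databook's schema), a rollup of raw events into chart-ready metrics could look like this:

```python
# Minimal sketch with fabricated event data; column names are hypothetical.
import pandas as pd

events = pd.DataFrame({
    "account_id": [1, 1, 2, 2, 3],
    "event_date": pd.to_datetime(
        ["2024-01-01", "2024-01-02", "2024-01-01", "2024-01-03", "2024-01-02"]
    ),
    "revenue_signal": [120.0, 80.0, 200.0, 50.0, 75.0],
})

# Roll raw events up into the kind of per-account daily metrics a dashboard would chart.
daily = (
    events.groupby([pd.Grouper(key="event_date", freq="D"), "account_id"])
          .agg(signals=("revenue_signal", "count"), total=("revenue_signal", "sum"))
          .reset_index()
)
print(daily)
```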

 

Ideal candidates will also have:

- Self-motivated with great organizational skills.

- Ability to focus on small and subtle details.

- Willingness to learn and adapt in a rapidly changing environment.

- Excellent written and oral communication skills.

 

Join us and enjoy these perks!

- Competitive salary with bonus

- Medical insurance coverage

- 5 weeks leave plus public holidays

- Employee referral bonus program

- Annual learning stipend to spend on books, courses or other training materials that help you develop skills relevant to your role or professional development

- Complimentary subscription to Masterclass

TensorGo Software Private Limited
Posted by Deepika Agarwal
Remote only
5 - 8 yrs
₹5L - ₹15L / yr
Python
PySpark
Apache Airflow
Spark
Hadoop
+4 more

Requirements:

● Understanding our data sets and how to bring them together.

● Working with our engineering team to support custom solutions offered to the product development.

● Filling the gap between development, engineering and data ops.

● Creating, maintaining and documenting scripts to support ongoing custom solutions.

● Excellent organizational skills, including attention to precise details

● Strong multitasking skills and ability to work in a fast-paced environment

● 5+ years of experience with Python for developing scripts.

● Know your way around RESTful APIs (able to integrate; publishing not necessary).

● You are familiar with pulling and pushing files from SFTP and AWS S3.

● Experience with any Cloud solutions including GCP / AWS / OCI / Azure.

● Familiarity with SQL programming to query and transform data from relational databases.

● Familiarity with Linux (and working in a Linux environment).

● Excellent written and verbal communication skills

● Extracting, transforming, and loading data into internal databases and Hadoop

● Optimizing our new and existing data pipelines for speed and reliability

● Deploying product build and product improvements

● Documenting and managing multiple repositories of code

● Experience with SQL and NoSQL databases (Cassandra, MySQL)

● Hands-on experience in data pipelining and ETL (any of these frameworks/tools: Hadoop, BigQuery, Redshift, Athena)

● Hands-on experience with Airflow

● Understanding of best practices and common coding patterns around storing, partitioning, warehousing and indexing of data

● Experience reading data from Kafka topics, both live stream and offline (see the sketch after this list)

● Experience with PySpark and DataFrames
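
A hedged sketch of the Kafka-plus-PySpark requirement above: the broker address, topic name, and app name are placeholders, and it assumes the Spark Kafka connector (spark-sql-kafka) is available on the cluster.

```python
# Sketch only: broker and topic names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

# Offline (batch) read of everything currently in the topic.
batch_df = (
    spark.read.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "events-topic")
         .option("startingOffsets", "earliest")
         .load()
)

# Kafka delivers key/value as binary; cast to strings before transforming.
parsed = batch_df.select(
    col("key").cast("string"),
    col("value").cast("string"),
    col("timestamp"),
)
parsed.show(5, truncate=False)

# The live-stream variant only swaps read for readStream (sink omitted here).
stream_df = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "events-topic")
         .load()
)
```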

Responsibilities:

You’ll be:

● Collaborating across an agile team to continuously design, iterate, and develop big data systems.

● Extracting, transforming, and loading data into internal databases.

● Optimizing our new and existing data pipelines for speed and reliability.

● Deploying new products and product improvements.

● Documenting and managing multiple repositories of code.

Gurugram
1 - 5 yrs
₹6L - ₹10L / yr
R Programming
SPSS
Python
Surveying
Data Analytics
Desired Skills & Mindset:

We are looking for candidates who have demonstrated both a strong business sense and deep understanding of the quantitative foundations of modelling.

• Excellent analytical and problem-solving skills, including the ability to disaggregate issues, identify root causes and recommend solutions
• Experience with statistical programming software such as SPSS, and comfort working with large data sets.
• R, Python, SAS and SQL are preferred but not mandatory
• Excellent time management skills
• Good written and verbal communication skills; understanding of both written and spoken English
• Strong interpersonal skills
• Ability to act autonomously, bringing structure and organization to work
• Creative and action-oriented mindset
• Ability to interact in a fluid, demanding and unstructured environment where priorities evolve constantly, and methodologies are regularly challenged
• Ability to work under pressure and deliver on tight deadlines

Qualifications and Experience:

• Graduate degree in Statistics/Economics/Econometrics/Computer Science/Engineering/Mathematics/MBA (with a strong quantitative background) or equivalent
• Strong track record of work experience in business intelligence, market research, and/or advanced analytics
• Knowledge of data collection methods (focus groups, surveys, etc.)
• Knowledge of statistical packages (SPSS, SAS, R, Python, or similar), databases, and MS Office (Excel, PowerPoint, Word)
• Strong analytical and critical thinking skills
• Industry experience in Consumer Experience/Healthcare is a plus
Top startup of India -  News App
Noida
2 - 5 yrs
₹20L - ₹35L / yr
Linux/Unix
Python
Hadoop
Apache Spark
MongoDB
+4 more
Responsibilities
● Create and maintain optimal data pipeline architecture.
● Assemble large, complex data sets that meet functional / non-functional
business requirements.
● Building and optimizing ‘big data’ data pipelines, architectures and data sets.
● Maintain, organize & automate data processes for various use cases.
● Identifying trends, doing follow-up analysis, preparing visualizations.
● Creating daily, weekly and monthly reports of product KPIs.
● Create informative, actionable and repeatable reporting that highlights
relevant business trends and opportunities for improvement.

Required Skills And Experience:
● 2-5 years of work experience in data analytics- including analyzing large data sets.
● BTech in Mathematics/Computer Science
● Strong analytical, quantitative and data interpretation skills.
● Hands-on experience with Python, Apache Spark, Hadoop, NoSQL databases (MongoDB preferred) and Linux is a must (a minimal aggregation sketch follows this list).
● Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
● Experience with Google Cloud Data Analytics Products such as BigQuery, Dataflow, Dataproc etc. (or similar cloud-based platforms).
● Experience working within a Linux computing environment, and use of command-line tools including knowledge of shell/Python scripting for automating common tasks.
● Previous experience working at startups and/or in fast-paced environments.
● Previous experience as a data engineer or in a similar role.
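
As a minimal illustration of the MongoDB-based KPI reporting implied above (the collection and field names are hypothetical, and it assumes a reachable MongoDB instance), a daily rollup could look like this:

```python
# Hypothetical collection and field names; assumes a reachable MongoDB instance.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
events = client["newsapp"]["article_events"]

# Daily KPI rollup: read counts and total time spent per day.
pipeline = [
    {"$match": {"event_type": "article_read"}},
    {"$group": {
        "_id": {"$dateToString": {"format": "%Y-%m-%d", "date": "$created_at"}},
        "reads": {"$sum": 1},
        "time_spent_sec": {"$sum": "$time_spent_sec"},
    }},
    {"$sort": {"_id": 1}},
]
for row in events.aggregate(pipeline):
    print(row)
```
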
Kaleidofin
Posted by Poornima B
Chennai, Bengaluru (Bangalore)
3 - 8 yrs
Best in industry
Data Science
Machine Learning (ML)
Python
SQL
Natural Language Processing (NLP)
• 4+ years of experience in advanced analytics, model building and statistical modeling
• Solid technical/data-mining skills and the ability to work with large volumes of data; extract and manipulate large datasets using common tools such as Python, SQL and other programming/scripting languages to translate data into business decisions/results
• Be data-driven and outcome-focused
• Must have good business judgment with demonstrated ability to think creatively and strategically
• Must be an intuitive, organized analytical thinker, with the ability to perform detailed analysis
• Takes personal ownership; self-starter; ability to drive projects with minimal guidance and focus on high-impact work
• Learns continuously; seeks out knowledge, ideas and feedback
• Looks for opportunities to build own skills, knowledge and expertise
• Experience with big data and cloud computing, viz. Spark, Hadoop (MapReduce, Pig, Hive)
• Experience in risk and credit score domains preferred
• Comfortable with ambiguity and frequent context-switching in a fast-paced environment
Hyderabad
5 - 8 yrs
₹12L - ₹25L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+6 more

Big Data Engineer: 5+ yrs.
Immediate Joiner

 

  • Expertise in building AWS data engineering pipelines with AWS Glue -> Athena -> QuickSight (a minimal Lambda-to-Athena sketch follows this list)
  • Experience in developing Lambda functions with AWS Lambda
  • Expertise with Spark/PySpark; the candidate should be hands-on with PySpark code and able to do transformations with Spark
  • Should be able to code in Python and Scala.
  • Snowflake experience will be a plus
  • Hadoop and Hive are good to have; a working understanding is sufficient rather than a hard requirement.
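
A rough sketch of the Lambda-to-Athena pattern referenced above: the database, table, and S3 output location are placeholders, and it assumes the Lambda execution role has Athena and S3 permissions. QuickSight would typically read the Athena results downstream.

```python
# Hedged sketch: bucket, database, and table names are placeholders.
import boto3

athena = boto3.client("athena")

def lambda_handler(event, context):
    """Kick off an Athena query over Glue-catalogued data; QuickSight can consume the result."""
    response = athena.start_query_execution(
        QueryString=(
            "SELECT channel, SUM(spend) AS spend "
            "FROM analytics.daily_spend GROUP BY channel"
        ),
        QueryExecutionContext={"Database": "analytics"},
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
    )
    # Returning the execution id lets a downstream step poll for completion.
    return {"query_execution_id": response["QueryExecutionId"]}
```
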
Hex Business Innovations
Posted by Dhruv Dua
Faridabad
0 - 4 yrs
₹1L - ₹3L / yr
SQL
SQL Server
MySQL
MS SQL Server
C#
+1 more

Job Summary
SQL development for our Enterprise Resource Planning (ERP) product offered to SMEs: regular creation, modification, validation and testing of stored procedures, views and functions on MS SQL Server.
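
As a rough illustration only (the DSN, database, and procedure names are hypothetical, not part of the actual ERP product), invoking such a stored procedure from Python with pyodbc might look like this:

```python
# Illustrative only: connection details and procedure name are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=erp-db.example.local;DATABASE=erp;UID=app_user;PWD=secret"
)
cursor = conn.cursor()

# Call a (hypothetical) stored procedure that returns open orders for a customer;
# the three-column unpacking assumes that is what the procedure returns.
cursor.execute("{CALL dbo.usp_GetOpenOrders (?)}", 1042)
for order_id, order_date, amount in cursor.fetchall():
    print(order_id, order_date, amount)

conn.close()
```
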
Responsibilities and Duties
Understanding the ERP Software and use cases.
Regular creation, modification and testing of:

  • Stored Procedures
  • Views
  • Functions
  • Nested Queries
  • Table and Schema Designs

Qualifications and Skills
MS SQL

  • Procedural Language
  • Datatypes
  • Objects
  • Databases
  • Schema
Artivatic
Posted by Layak Singh
Bengaluru (Bangalore)
3 - 7 yrs
₹6L - ₹14L / yr
Python
Machine Learning (ML)
Artificial Intelligence (AI)
Deep Learning
Natural Language Processing (NLP)
+3 more
About Artivatic: Artivatic is a technology startup that uses AI/ML/deep learning to build intelligent products and solutions for finance, healthcare and insurance businesses. It is based out of Bangalore with a 25+ member team focused on technology. Artivatic builds cutting-edge solutions to enable 750 million+ people to get insurance, financial access and health benefits, using alternative data sources to increase productivity, efficiency, automation power and profitability, and hence help businesses operate more intelligently and seamlessly.

Artivatic offers lending underwriting, credit/insurance underwriting, fraud prediction, personalization, recommendation, risk profiling, consumer profiling intelligence, KYC automation and compliance, healthcare, automated decisions, monitoring, claims processing, sentiment/psychology behaviour analysis, auto insurance claims, travel insurance, disease prediction for insurance, and more.

Job description

We at Artivatic are seeking a passionate, talented and research-focused natural language processing engineer with a strong machine learning and mathematics background to help build industry-leading technology. The ideal candidate will have research/implementation experience in modeling and developing NLP tools, and experience working with machine learning/deep learning algorithms.

Roles and responsibilities

Developing novel algorithms and modeling techniques to advance the state of the art in natural language processing. Developing NLP-based tools and solutions end to end. Working closely with R&D and machine learning engineers to implement algorithms that power user- and developer-facing products. Being responsible for measuring and optimizing the quality of your algorithms.

Requirements

Hands-on experience building NLP models using NLP libraries and toolkits such as NLTK and Stanford NLP. Good understanding of rule-based, statistical and probabilistic NLP techniques. Good knowledge of NLP approaches and concepts such as topic modeling, text summarization, semantic modeling and named entity recognition. Good understanding of machine learning and deep learning algorithms. Good knowledge of data structures and algorithms. Strong programming skills in Python/Java/Scala/C/C++. Strong problem-solving and logical skills. A go-getter attitude with the willingness to learn new technologies. Well versed in software design paradigms and good development practices.

Basic qualifications

Bachelor's or Master's degree in Computer Science, Mathematics or a related field with specialization in natural language processing, machine learning or deep learning. A publication record in conferences/journals is a plus. 2+ years of working/research experience building NLP-based solutions is preferred.

If you feel that you are the ideal candidate and can bring a lot of value to our culture and the company's vision, please do apply. If your profile matches our requirements, you will hear from one of our team members. We are looking for someone who can be part of our team, not just an employee.

Job perks: Insurance, travel compensation and others.
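
As a small, hedged example of the named-entity-recognition work mentioned in the requirements (the claim text is fabricated, and it assumes spaCy with the en_core_web_sm model installed via `python -m spacy download en_core_web_sm`):

```python
# Small sketch; the claim text and any downstream use are hypothetical.
import spacy

nlp = spacy.load("en_core_web_sm")
claim_text = (
    "The policyholder, Ravi Kumar, filed a claim of INR 45,000 in Bangalore "
    "on 12 March 2023 for a hospitalisation at Apollo Hospital."
)

doc = nlp(claim_text)
# Entities such as PERSON, GPE, MONEY, DATE and ORG could feed downstream
# underwriting or claims-automation rules.
for ent in doc.ents:
    print(ent.text, ent.label_)
```
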
Agency job
via UpgradeHR by Sangita Deka
Hyderabad
6 - 10 yrs
₹10L - ₹15L / yr
Big Data
Data Science
Machine Learning (ML)
R Programming
Python
+2 more
The client is one of the largest communication technology companies in the world. It operates America's largest 4G LTE wireless network and the nation's premier all-fiber broadband network.