
Big Data Engineer

Posted by Nikita Aher
2.5 - 6 yrs
₹1L - ₹8L / yr
Pune
Skills
Big Data
Hadoop
Apache Hive
Spark
Data engineering
Pig
Data Warehouse (DWH)
SQL
Job Title/Designation: Big Data Engineers - Hadoop, Pig, Hive, Spark
Employment Type: Full Time, Permanent

Job Description:
 
Work Location - Pune
Work Experience - 2.5 to 6 Years
 
Note - Candidates with short notice periods will be given preference.
 
Mandatory Skills:
  • Working knowledge and hands-on experience of Big Data / Hadoop tools and technologies.
  • Experience of working with Pig, Hive, Flume, Sqoop, Kafka, etc.
  • Database development experience with a solid understanding of core database concepts, relational database design, ODS & DWH.
  • Expert-level knowledge of SQL and scripting, preferably UNIX shell scripting and Perl scripting.
  • Working knowledge of data integration solutions and well-versed in any ETL tool (Informatica / DataStage / Ab Initio / Pentaho, etc.).
  • Strong problem solving and logical reasoning ability.
  • Excellent understanding of all aspects of the Software Development Lifecycle.
  • Excellent written and verbal communication skills.
  • Experience in Java will be an added advantage.
  • Knowledge of object-oriented programming concepts.
  • Exposure to ISMS policies and procedures.
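The Hive and SQL skills above mostly come down to writing aggregate queries over warehouse tables. A minimal sketch of such a query, run here against SQLite purely for illustration (the same simple GROUP BY aggregate is valid HiveQL; the table and column names are invented):

```python
import sqlite3

# In-memory database standing in for a Hive/DWH table; schema is illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("west", 120.0), ("west", 80.0), ("east", 50.0)],
)

# The kind of GROUP BY aggregate a warehouse engineer writes daily;
# the same statement would run unchanged against a Hive-managed table.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total "
    "FROM orders GROUP BY region ORDER BY total DESC"
).fetchall()

print(rows)  # [('west', 200.0), ('east', 50.0)]
```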
Users love Cutshort
Read about what our users have to say about finding their next opportunity on Cutshort.

Subodh Popalwar

Software Engineer, Memorres
For 2 years, I had trouble finding a company with good work culture and a role that will help me grow in my career. Soon after I started using Cutshort, I had access to information about the work culture, compensation and what each company was clearly offering.

About DataMetica

Founded: 2013
Type:
Size: 100-1000
Stage: Profitable
About
As a global leader in Data Warehouse Migration, Data Modernization, and Data Analytics, we empower businesses through automation and help them attain excellence. We believe in empowering companies to master their businesses and achieve their full potential, nurturing clients with our innovative frameworks. Our embedded values strengthen the bond with our clients, ensuring growth for all. Datametica is a preferred partner of leading cloud vendors. We offer solutions for migrating from current Enterprise Data Warehouses to the Cloud, determining which option is best suited to your needs. We are giving Data Wings.
Connect with the team
Sumangali Desai
Shivani Mahale
Nitish Saxena
Nikita Aher
Pooja Gaikwad
Sayali Kachi
Syed Raza

Similar jobs

Media.net
Posted by Akshata Kulkarni
Mumbai
0 - 0 yrs
₹2L - ₹3.5L / yr
SQL
MS-Excel
Communication Skills
SAS
SPSS

Kindly note that only candidates who graduated in 2022 or 2023, are based in Mumbai, and can join immediately will be considered for this role.



JD - Data Operations Analyst


What is the job and team like?

  • As a Data Operations Analyst, you manage business reporting for numerous teams and constantly monitor performance
  • Check the integrity of revenue reporting done by the different systems so that the correct profitability is reported to the CXOs
  • Send reports periodically and alert stakeholders to changes in key performance metrics
  • Allocate efforts to different business implementations that help build the Profit/Loss statement for the Financials
  • Track crucial data points which affect the core of the business and escalate them to senior stakeholders
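The monitoring duties above amount to comparing each metric against its recent baseline and flagging large moves. A hypothetical sketch of that check (the metric names and the 10% threshold are invented for illustration):

```python
def kpi_alerts(current, baseline, threshold=0.10):
    """Return metrics whose relative change vs. baseline exceeds threshold."""
    alerts = {}
    for name, value in current.items():
        base = baseline.get(name)
        if not base:
            continue  # no baseline to compare against
        change = (value - base) / base
        if abs(change) > threshold:
            alerts[name] = round(change, 3)
    return alerts

# Revenue dropped 15% vs. baseline -> flagged; fill_rate barely moved -> not flagged.
current = {"revenue": 85_000, "fill_rate": 0.97}
baseline = {"revenue": 100_000, "fill_rate": 0.96}
print(kpi_alerts(current, baseline))  # {'revenue': -0.15}
```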


Roles and Responsibilities


  • Graduate in IT background (BE/BSc IT/BCA); 2022 and 2023 graduates only
  • Executing a set of business processes daily/weekly/monthly as per Business requirement.
  • Provide ad-hoc data support on any urgent reports and material in an expedited manner
  • Maintain a list of open tasks and escalations, and send updates to the relevant stakeholders
  • Have an eye for detail, should have the ability to look at numbers, spot trends and identify gaps
  • Identify efficient and meaningful ways to communicate data and analysis through ongoing reports and dashboards
  • Proficiency in SQL, Excel and any statistical and analytical tools such as SAS, SPSS is a big plus
  • Managing master data, including creation, updates, and deletion.
  • Ability to work in a fast paced, technical, cross functional environment
  • Familiarity with Internet Industry and Online Advertising Business is a plus


Ideal candidate


  • Import and export large volume of data to database tables as required
  • Should be able to write Data Definition Language (DDL) and Data Manipulation Language (DML) SQL commands
  • Develop programs, methodologies to get analyzable data on a regular basis
  • Good team player and multi-tasker
  • Should have the ability to learn and adapt to change
  • Self-starter; must be productive with minimal direction
  • High-level written and verbal communication skills
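Importing large volumes of data into database tables, as asked for above, usually means batched inserts from delimited files. A minimal stdlib sketch using SQLite as the stand-in database (the file contents are inlined and invented for illustration; in practice this would be an open file handle):

```python
import csv
import io
import sqlite3

# Stand-in for a large CSV export from an upstream system.
raw = io.StringIO("id,city,sales\n1,Mumbai,250\n2,Pune,410\n")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, city TEXT, sales INTEGER)")

reader = csv.DictReader(raw)
# executemany batches the inserts -- the usual pattern for volume loads.
conn.executemany("INSERT INTO sales VALUES (:id, :city, :sales)", reader)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0]
print(count)  # 2
```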


Job Details

Work mode- In office

Must-have skills - SQL, MS Excel, Communication


Graasai
Posted by Vineet A
Pune
3 - 7 yrs
₹10L - ₹30L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark

Graas uses predictive AI to turbo-charge growth for eCommerce businesses. We are “Growth-as-a-Service”. Graas integrates traditional data silos and applies a machine-learning AI engine, acting as an in-house data scientist to predict trends and give real-time insights and actionable recommendations for brands. The platform can also turn insights into action by seamlessly executing these recommendations across marketplace storefronts, brand.coms, social and conversational commerce, performance marketing, inventory management, warehousing, and last-mile logistics - all of which impact a brand’s bottom line, driving profitable growth.


Roles & Responsibilities:

Work on implementation of real-time and batch data pipelines for disparate data sources.

  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS technologies.
  • Build and maintain an analytics layer that utilizes the underlying data to generate dashboards and provide actionable insights.
  • Identify improvement areas in the current data system and implement optimizations.
  • Work on specific areas of data governance including metadata management and data quality management.
  • Participate in discussions with Product Management and Business stakeholders to understand functional requirements and interact with other cross-functional teams as needed to develop, test, and release features.
  • Develop Proof-of-Concepts to validate new technology solutions or advancements.
  • Work in an Agile Scrum team and help with planning, scoping and creation of technical solutions for the new product capabilities, through to continuous delivery to production.
  • Work on building intelligent systems using various AI/ML algorithms. 
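The batch-pipeline responsibilities above follow the classic extract-transform-load shape. A toy sketch of one batch run (the source records, field names, and in-memory "warehouse" are all invented for illustration; a real pipeline would read from and write to actual systems):

```python
def extract():
    # Stand-in for reading from a source system or landing zone.
    return [
        {"sku": "A1", "qty": "3", "price": "10.0"},
        {"sku": "B2", "qty": "1", "price": "99.5"},
    ]

def transform(records):
    # Cast string fields to numbers and derive a revenue column.
    return [
        {"sku": r["sku"], "revenue": int(r["qty"]) * float(r["price"])}
        for r in records
    ]

def load(rows, sink):
    # Stand-in for writing to a warehouse table.
    sink.extend(rows)

warehouse = []
load(transform(extract()), warehouse)
print(warehouse)  # [{'sku': 'A1', 'revenue': 30.0}, {'sku': 'B2', 'revenue': 99.5}]
```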

 

Desired Experience/Skill:

 

  • Must have worked on Analytics Applications involving Data Lakes, Data Warehouses and Reporting Implementations.
  • Experience with private and public cloud architectures with pros/cons.
  • Ability to write robust code in Python and SQL for data processing. Experience in libraries such as Pandas is a must; knowledge of one of the frameworks such as Django or Flask is a plus.
  • Experience in implementing data processing pipelines using AWS services: Kinesis, Lambda, Redshift/Snowflake, RDS.
  • Knowledge of Kafka and Redis is preferred
  • Experience in design and implementation of real-time and batch pipelines. Knowledge of Airflow is preferred.
  • Familiarity with machine learning frameworks (like Keras or PyTorch) and libraries (like scikit-learn)
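Airflow, mentioned above, models a pipeline as a DAG of tasks and runs each task only after its upstream dependencies finish. The scheduling idea can be sketched with the stdlib `graphlib` module (the task names are invented; a real Airflow DAG uses its own operator API rather than plain dicts):

```python
from graphlib import TopologicalSorter

# Each task maps to the set of upstream tasks it depends on.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
}

# static_order yields a valid execution order respecting every dependency.
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'transform', 'quality_check', 'load']
```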
A Product Company
Bengaluru (Bangalore)
3 - 6 yrs
₹15L - ₹26L / yr
Looker
Big Data
Hadoop
Spark
Apache Hive
Job Title: Senior Data Engineer/Analyst
Location: Bengaluru
Department: - Engineering 

Bidgely is looking for an extraordinary and dynamic Senior Data Analyst to be part of its core team in Bangalore. You must have delivered exceptionally high-quality, robust products dealing with large data. Be part of a highly energetic and innovative team that believes nothing is impossible with some creativity and hard work.

Responsibilities 
● Design and implement a high-volume data analytics pipeline in Looker for Bidgely's flagship product.
● Implement data pipelines in the Bidgely Data Lake.
● Collaborate with product management and engineering teams to elicit & understand their requirements & challenges and develop potential solutions 
● Stay current with the latest tools, technology ideas and methodologies; share knowledge by clearly articulating results and ideas to key decision makers. 

Requirements 
● 3-5 years of strong experience in data analytics and in developing data pipelines. 
● Very good expertise in Looker 
● Strong in data modeling, developing SQL queries and optimizing queries. 
● Good knowledge of data warehouse (Amazon Redshift, BigQuery, Snowflake, Hive). 
● Good understanding of Big data applications (Hadoop, Spark, Hive, Airflow, S3, Cloudera) 
● Attention to detail. Strong communication and collaboration skills.
● BS/MS in Computer Science or equivalent from premier institutes.
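Optimizing queries, as required above, usually starts with reading the query plan and adding the right index. A SQLite illustration of the before/after (table and index names are invented; real warehouses like Redshift or BigQuery expose the same idea through their own EXPLAIN):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, ts TEXT)")

def plan(sql):
    # Concatenate the 'detail' column of SQLite's query-plan rows.
    return " ".join(row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT * FROM events WHERE user_id = 42"
before = plan(query)  # full table scan: no index to use yet
conn.execute("CREATE INDEX idx_user ON events(user_id)")
after = plan(query)   # now an index lookup on user_id

print("SCAN" in before, "USING INDEX" in after)  # True True
```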
Mobile Programming India Pvt Ltd
Posted by Inderjit Kaur
Bengaluru (Bangalore), Chennai, Pune, Gurugram
4 - 8 yrs
₹9L - ₹14L / yr
ETL
Data Warehouse (DWH)
Data engineering
Data modeling
BRIEF JOB RESPONSIBILITIES:

• Responsible for designing, deploying, and maintaining analytics environment that processes data at scale
• Contribute design, configuration, deployment, and documentation for components that manage data ingestion, real time streaming, batch processing, data extraction, transformation, enrichment, and loading of data into a variety of cloud data platforms, including AWS and Microsoft Azure
• Identify gaps and improve the existing platform to improve quality, robustness, maintainability, and speed
• Evaluate new and upcoming big data solutions and make recommendations for adoption to extend our platform to meet advanced analytics use cases, such as predictive modeling and recommendation engines
• Data modelling and data warehousing at cloud scale using cloud-native solutions.
• Perform development, QA, and dev-ops roles as needed to ensure total end to end responsibility of solutions

COMPETENCIES
• Experience building, maintaining, and improving Data Models / Processing Pipeline / routing in large scale environments
• Fluency in common query languages, API development, data transformation, and integration of data streams
• Strong experience with large-dataset platforms such as Amazon EMR, Amazon Redshift, AWS Lambda & Fargate, Amazon Athena, Azure SQL Database, Azure Database for PostgreSQL, Azure Cosmos DB, and Databricks
• Fluency in multiple programming languages, such as Python, Shell Scripting, SQL, Java, or similar languages and tools appropriate for large scale data processing.
• Experience with acquiring data from varied sources such as: API, data queues, flat-file, remote databases
• Understanding of traditional Data Warehouse components (e.g. ETL, Business Intelligence Tools)
• Creativity to go beyond current tools to deliver the best solution to the problem
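Acquiring data from varied sources, as listed above, mostly means normalizing different wire formats into one record shape before loading. A stdlib sketch combining a JSON API-style payload and a flat-file extract (both payloads are invented for illustration):

```python
import csv
import io
import json

# Two sources, two shapes -- a JSON API payload and a CSV flat-file extract.
api_payload = json.loads('[{"id": 1, "amount": 10.5}]')
flat_file = io.StringIO("id,amount\n2,4.0\n")

def normalize(records):
    # Coerce every source into the same (id: int, amount: float) shape.
    return [{"id": int(r["id"]), "amount": float(r["amount"])} for r in records]

unified = normalize(api_payload) + normalize(csv.DictReader(flat_file))
print(unified)  # [{'id': 1, 'amount': 10.5}, {'id': 2, 'amount': 4.0}]
```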
High-Growth Fintech Startup
Agency job
via Unnati by Ramya Senthilnathan
Remote, Mumbai
3 - 5 yrs
₹7L - ₹10L / yr
Business Intelligence (BI)
PowerBI
Analytics
Reporting
Data management
Want to join a trailblazing Fintech company that is leveraging software and technology to change the face of short-term financing in India?

Our client is an innovative Fintech company that is revolutionizing the business of short term finance. The company is an online lending startup that is driven by an app-enabled technology platform to solve the funding challenges of SMEs by offering quick-turnaround, paperless business loans without collateral. It counts over 2 million small businesses across 18 cities and towns as its customers. Its founders are IIT and ISB alumni with deep experience in the fin-tech industry, from earlier working with organizations like Axis Bank, Aditya Birla Group, Fractal Analytics, and Housing.com. It has raised funds of Rs. 100 Crore from finance industry stalwarts and is growing by leaps and bounds.
 
As a Data Analyst - SQL, you will be working on projects in the Analytics function to generate insights for business as well as manage reporting for the management for all things related to Lending.
 
You will be part of a rapidly growing tech-driven organization and will be responsible for generating insights that will drive business impact and productivity improvements.
 
What you will do:
  • Ensuring ease of data availability, with relevant dimensions, using Business Intelligence tools.
  • Providing strong reporting and analytical information support to the management team.
  • Transforming raw data into essential metrics based on the needs of relevant stakeholders.
  • Performing data analysis for generating reports on a periodic basis.
  • Converting essential data into easy to reference visuals using Data Visualization tools (PowerBI, Metabase).
  • Providing recommendations to update current MIS to improve reporting efficiency and consistency.
  • Bringing fresh ideas to the table and keenly observing trends in the analytics and financial services industry.

 

 

What you need to have:
  • B.Tech/B.E. or MBA/PGDM graduate, with work experience of 3+ years.
  • Experience in Reporting, Data Management (SQL, MongoDB), and Visualization (PowerBI, Metabase, Data Studio).
  • Work experience in financial services (Indian banks'/NBFCs' in-house analytics units or Fintech/analytics start-ups) would be a plus.
Skills:
  • Skilled at writing & optimizing large complicated SQL queries & MongoDB scripts.
  • Strong knowledge of Banking/ Financial Services domain
  • Experience with some of the modern relational databases
  • Ability to work on multiple projects of different nature; self-driven
  • Liaise with cross-functional teams to resolve data issues and build strong reports

 

MNC
Agency job
via Fragma Data Systems by Geeti Gaurav Mohanty
Bengaluru (Bangalore)
2 - 5 yrs
₹7L - ₹12L / yr
Spark
Python
SQL
Primary Responsibilities:
• Responsible for developing and maintaining applications with PySpark
• Contribute to the overall design and architecture of the application developed and deployed.
• Performance tuning with respect to executor sizing and other environment parameters, code optimization, partition tuning, etc.
• Interact with business users to understand requirements and troubleshoot issues.
• Implement Projects based on functional specifications.


Must-Have Skills:

• Good experience in PySpark, including DataFrame core functions and Spark SQL
• Good customer communication
• Good analytical skills
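The DataFrame core functions named above (groupBy, agg and friends) boil down to grouped aggregation over rows. A plain-Python sketch of what `df.groupBy("dept").agg(sum("salary"))` computes in PySpark, written without Spark so it runs anywhere (the column names and rows are invented):

```python
from collections import defaultdict

rows = [
    {"dept": "eng", "salary": 100},
    {"dept": "eng", "salary": 120},
    {"dept": "ops", "salary": 90},
]

# Equivalent result to df.groupBy("dept").agg(sum("salary")) in PySpark,
# computed here on a single machine instead of a distributed cluster.
totals = defaultdict(int)
for row in rows:
    totals[row["dept"]] += row["salary"]

print(dict(totals))  # {'eng': 220, 'ops': 90}
```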
Spoonshot Inc.
Posted by Rajesh Bhutada
Bengaluru (Bangalore)
1 - 4 yrs
₹9L - ₹15L / yr
Data Analytics
Data Visualization
Analytics
SQLite
PowerBI
- Prior experience in Business Analytics and knowledge of related analysis or visualization tools
- Expecting a minimum of 2-4 years of relevant experience
- You will be managing a team of 3 currently
- Take up the ownership of developing and managing one of the largest and richest food (recipe, menu, and CPG) databases
- Interactions with cross-functional teams (Business, Food Science, Product, and Tech) on a regular basis to pan the future of client and internal food data management
- Should have a natural flair for playing with numbers and data and have a keen eye for detail and quality
- Will spearhead the Ops team in achieving the targets while maintaining a staunch attentiveness to Coverage, Completeness, and Quality of the data
- Shall program and manage projects while identifying opportunities to optimize costs and processes.
- Good business acumen in creating logic & process flows; quick and smart decision-making skills are expected
- Will also be responsible for recruiting, inducting, and training new members as well
- Setting competitive team targets. Guide and support the team members to go the extra mile and achieve set targets


Added Advantages :
- Experience in a Food Sector / Insights company
- Has a passion for exploring different cuisines
- Understands industry-related jargons and has a natural flair towards learning more about anything related to food
Elucidata Corporation
Posted by Bhuvnesh Sharma
Remote, NCR (Delhi | Gurgaon | Noida)
4 - 6 yrs
₹15L - ₹20L / yr
Big Data
JavaScript
AngularJS (1.x)
React.js
About Elucidata:
Our mission is to make data-driven understanding of disease the default starting point in the drug discovery process. Our products & services further the understanding of the ways in which diseased cells are different from healthy ones. This understanding helps scientists discover new drugs in a more effective manner and complements the move towards personalization. Biological big data will outpace data generated by YouTube and Twitter by 10x in the next 7 yrs. Our platform Polly will enable scientists to process different kinds of biological data and generate insights from them to accelerate drug discovery. Polly is already being used at premier biopharma companies like Pfizer and Agios, and academic labs at Yale, MIT, and Washington University. We are looking for teammates who think out-of-the-box and are not satisfied with quick fixes or canned solutions to our industry’s most challenging problems. If you seek an intellectually stimulating environment where you can have a major impact on a critically important industry, we’d like to talk to you.

About Role:
We are looking for engineers who want to build data-rich applications and love the end-to-end product journey from understanding customer needs to the final product.

Key Responsibilities:
- Developing web applications to visualize and process scientific data.
- Interacting with Product, Design and Engineering teams to spec, build, test and deploy new features.
- Understanding user needs and the science behind it.
- Mentoring junior developers.

Requirements:
- Minimum 3-4 years of experience working in web development
- In-depth knowledge of JavaScript
- Hands-on experience with modern frameworks (Angular, React)
- Sound programming and computer science fundamentals
- Good understanding of web architecture and single page applications

You might be a great cultural fit for Elucidata if:
- You are passionate about science.
- You are a self-learner who wants to keep learning every day.
- You regard your code as your craft that you want to keep honing.
- You like to work hard to solve big challenges and enjoy the process of breaking down a problem one blow at a time.
- You love science and can't stop being the geek at a party. Of course you party harder than everybody else there.
Mintifi
Posted by Suchita Upadhyay
Mumbai
2 - 4 yrs
₹6L - ₹15L / yr
Big Data
Hadoop
MySQL
MongoDB
YARN
Job Title: Software Developer – Big Data

Responsibilities:
We are looking for a Big Data Developer who can drive innovation, take ownership, and deliver results.
• Understand business requirements from stakeholders
• Build & own Mintifi Big Data applications
• Be heavily involved in every step of the product development process, from ideation to implementation to release
• Design and build systems with automated instrumentation and monitoring
• Write unit & integration tests
• Collaborate with cross-functional teams to validate and get feedback on the efficacy of results created by the big data applications. Use the feedback to improve the business logic
• Proactive approach to turn ambiguous problem spaces into clear design solutions

Qualifications:
• Hands-on programming skills in Apache Spark using Java or Scala
• Good understanding of Data Structures and Algorithms
• Good understanding of relational and non-relational database concepts (MySQL, Hadoop, MongoDB)
• Experience in Hadoop ecosystem components like YARN, Zookeeper would be a strong plus
UpX Academy
Posted by Suchit Majumdar
Noida, Hyderabad, NCR (Delhi | Gurgaon | Noida)
2 - 6 yrs
₹4L - ₹12L / yr
Spark
Hadoop
MongoDB
Python
Scala
Looking for a technically sound and excellent trainer on big data technologies. Get an opportunity to become popular in the industry and gain visibility. Host regular sessions on big data-related technologies and get paid to learn.