Data Engineer
Posted by Molecular Connections
4 - 9 yrs
₹8L - ₹12L / yr
Bengaluru (Bangalore)
Skills
ETL
Informatica
Data Warehouse (DWH)
Spark
Hadoop
Python
Microsoft Azure
AWS
SQL
HiveQL

Job Description: Data Engineer


Experience: Over 4 years


Responsibilities:

-       Design, develop, and maintain scalable data pipelines for efficient data extraction, transformation, and loading (ETL) processes.

-       Architect and implement data storage solutions, including data warehouses, data lakes, and data marts, aligned with business needs.

-       Implement robust data quality checks and data cleansing techniques to ensure data accuracy and consistency.

-       Optimize data pipelines for performance, scalability, and cost-effectiveness.

-       Collaborate with data analysts and data scientists to understand data requirements and translate them into technical solutions.

-       Develop and maintain data security measures to ensure data privacy and regulatory compliance.

-       Automate data processing tasks using scripting languages (Python, Bash) and big data frameworks (Spark, Hadoop).

-       Monitor data pipelines and infrastructure for performance and troubleshoot any issues.

-       Stay up to date with the latest trends and technologies in data engineering, including cloud platforms (AWS, Azure, GCP).

-        Document data pipelines, processes, and data models for maintainability and knowledge sharing.

-       Contribute to the overall data governance strategy and best practices.
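As a concrete illustration of the extract-transform-load responsibilities above, here is a minimal, hypothetical sketch using only Python's standard library. The source data, table, and column names are invented; a production pipeline would typically use Spark, Informatica, or an orchestration framework rather than SQLite.

```python
import csv
import io
import sqlite3

# Hypothetical raw source data; stands in for a real file or API extract.
RAW_CSV = """order_id,amount,country
1,100.50,IN
2,,US
3,75.00,in
"""

def extract(source: str) -> list[dict]:
    """Read raw records from a CSV string (stand-in for a real source)."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list[dict]) -> list[tuple]:
    """Drop rows missing an amount and normalise country codes."""
    cleaned = []
    for row in rows:
        if not row["amount"]:
            continue  # basic data-quality check: skip incomplete records
        cleaned.append((int(row["order_id"]), float(row["amount"]),
                        row["country"].upper()))
    return cleaned

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Write cleaned rows into the target warehouse table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INT, amount REAL, country TEXT)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
loaded = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
```

The same extract/transform/load split carries over directly to Spark or a managed ETL tool; only the I/O layers change.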

 

Qualifications:

-       Strong understanding of data architectures, data modelling principles, and ETL processes.

-       Proficiency in SQL (e.g., MySQL, PostgreSQL) and experience with big data querying languages (e.g., Hive, Spark SQL).

-       Experience with scripting languages (Python, Bash) for data manipulation and automation.

-       Experience with distributed data processing frameworks (Spark, Hadoop) (preferred).

-       Familiarity with cloud platforms (AWS, Azure, GCP) for data storage and processing (a plus).

-       Experience with data quality tools and techniques.

-       Excellent problem-solving, analytical, and critical thinking skills.

-       Strong communication, collaboration, and teamwork abilities.

Users love Cutshort
Read about what our users have to say about finding their next opportunity on Cutshort.
Subodh Popalwar's profile image

Subodh Popalwar

Software Engineer, Memorres
For 2 years, I had trouble finding a company with good work culture and a role that will help me grow in my career. Soon after I started using Cutshort, I had access to information about the work culture, compensation and what each company was clearly offering.

About Molecular Connections

Founded: 2001
Size: 1000-5000
Stage: Profitable

Connect with the team: Molecular Connections, Chendil Kumar, Gurminder Kaur, Lokanath Khamari

Similar jobs

Wissen Technology
Posted by Sukanya Mohan
Bengaluru (Bangalore)
8 - 15 yrs
Best in industry
Snowflake schema
Python
PySpark
Databricks

Responsibilities:

  • Lead the design, development, and implementation of scalable data architectures leveraging Snowflake, Python, PySpark, and Databricks.
  • Collaborate with business stakeholders to understand requirements and translate them into technical specifications and data models.
  • Architect and optimize data pipelines for performance, reliability, and efficiency.
  • Ensure data quality, integrity, and security across all data processes and systems.
  • Provide technical leadership and mentorship to junior team members.
  • Stay abreast of industry trends and best practices in data architecture and analytics.
  • Drive innovation and continuous improvement in data management practices.
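Much of the Snowflake data-modeling and pipeline-optimization work described above comes down to recurring warehouse patterns, such as deduplicating a change feed to the latest record per key. The sketch below shows that pattern with an invented table, using SQLite so it runs anywhere; the same window-function SQL applies in Snowflake.

```python
import sqlite3

# Invented change feed: multiple versions of the same customer record.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer_feed (customer_id INT, name TEXT, updated_at TEXT)")
conn.executemany(
    "INSERT INTO customer_feed VALUES (?, ?, ?)",
    [(1, "Asha", "2024-01-01"), (1, "Asha K", "2024-03-01"), (2, "Ravi", "2024-02-10")],
)

# Keep only the most recent row per key using a window function.
latest = conn.execute(
    """
    SELECT customer_id, name FROM (
        SELECT *, ROW_NUMBER() OVER (
            PARTITION BY customer_id ORDER BY updated_at DESC
        ) AS rn
        FROM customer_feed
    ) WHERE rn = 1
    ORDER BY customer_id
    """
).fetchall()
```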

Requirements:

  • Bachelor's degree in Computer Science, Information Systems, or a related field. Master's degree preferred.
  • 5+ years of experience in data architecture, data engineering, or a related field.
  • Strong proficiency in Snowflake, including data modeling, performance tuning, and administration.
  • Expertise in Python and PySpark for data processing, manipulation, and analysis.
  • Hands-on experience with Databricks for building and managing data pipelines.
  • Proven leadership experience, with the ability to lead cross-functional teams and drive projects to successful completion.
  • Experience in the banking or insurance domain is highly desirable.
  • Excellent communication skills, with the ability to effectively collaborate with stakeholders at all levels of the organization.
  • Strong problem-solving and analytical skills, with a keen attention to detail.

Benefits:

  • Competitive salary and performance-based incentives.
  • Comprehensive benefits package, including health insurance, retirement plans, and wellness programs.
  • Flexible work arrangements, including remote options.
  • Opportunities for professional development and career advancement.
  • Dynamic and collaborative work environment with a focus on innovation and continuous learning.


Carsome
Posted by Piyush Palkar
Kuala Lumpur
3 - 5 yrs
₹20L - ₹25L / yr
SQL
Python
Business Analysis
Statistical Modeling
MS-Office

Your Day-to-Day

  1. Derive insights and drive major strategic projects to improve business metrics, taking responsibility for cost efficiency and revenue management across the country
  2. Perform market research and post-mortem analyses of competitor expansion and market-penetration patterns
  3. Provide in-depth business analysis and data insights for internal stakeholders to help improve the business; derive and launch projects to close the gaps between targeted and projected business metrics
  4. Optimize Carsome's C2B and B2C customer-acquisition and dealer-retention funnels; work closely with the marketing and tech teams to create, produce, and implement creative digital marketing campaigns and drive CRM initiatives and strategies
  5. Analyse revenue flows and process large datasets to gather process insights and propose process-improvement ideas for Carsome across SE Asia
  6. Lead commercial projects and process mapping, from conceptualization to completion, to build or re-engineer business models, tools, and processes
  7. Experience with analysis of unit economics, COGS, and P&L is preferred but not mandatory
  8. Use business intelligence and data science tools (SQL, Tableau, or Python) to answer the relevant business questions
  9. Coordinate with the HQ Data Insights Team and manage internal stakeholders across departments to ensure the smooth delivery of strategic projects
  10. Work across departments/functions (BI, DE, tech, pricing, finance, operations, marketing, CS, CX) on high-impact projects and support business-expansion initiatives
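A typical piece of the funnel-optimization work described above is measuring stage-to-stage conversion. Below is a hedged illustration with an invented events table and stage names, run through SQLite from Python; the real schema and stages would of course differ.

```python
import sqlite3

# Invented acquisition-funnel events: each row is one user reaching a stage.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE funnel_events (user_id INT, stage TEXT)")
conn.executemany("INSERT INTO funnel_events VALUES (?, ?)", [
    (1, "visit"), (1, "inspection"), (1, "sale"),
    (2, "visit"), (2, "inspection"),
    (3, "visit"),
])

# Distinct users per stage, ordered from top of funnel to bottom.
rows = conn.execute(
    "SELECT stage, COUNT(DISTINCT user_id) FROM funnel_events "
    "GROUP BY stage ORDER BY 2 DESC"
).fetchall()

# Overall conversion from the first stage to the last stage.
conversion = rows[-1][1] / rows[0][1]
```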





Your Know-Know


  • At least a Bachelor's degree in Accounting/Finance/Business or the equivalent
  • 3-5 years of experience in strategy/consulting/analytics/project-management roles; experience in e-commerce, start-ups, or unicorns (CARS24, OLA, SWIGGY, FLIPKART, OYO) or entrepreneurial experience preferred, plus at least 2 years of experience leading a team
  • Top-notch academics from a Tier 1 college (IIM/IIT/NIT)
  • Must have SQL/PostgreSQL/Tableau experience
  • Excellent market research, reporting, and analytical skills, including carrying out weekly and monthly reporting
  • Experience working with Data/Business Intelligence teams
  • Analytical mindset with the ability to present data in a structured and informative way
  • Enjoys a fast-paced environment and can align business objectives with product priorities
  • Good to have: financial modelling, developing financial forecasts, and development of financial/strategic plans and frameworks
Thoughtworks
Posted by Sunidhi Thakur
Bengaluru (Bangalore)
10 - 13 yrs
Best in industry
Data modeling
PySpark
Data engineering
Big Data
Hadoop

Lead Data Engineer

 

Data Engineers develop modern data architecture approaches to meet key business objectives and provide end-to-end data solutions. You might spend a few weeks with a new client on a deep technical review or a complete organizational review, helping them to understand the potential that data brings to solve their most pressing problems. On other projects, you might be acting as the architect, leading the design of technical solutions, or perhaps overseeing a program inception to build a new product. It could also be a software delivery project where you're equally happy coding and tech-leading the team to implement the solution.

 

Job responsibilities

 

·      You might spend a few weeks with a new client on a deep technical review or a complete organizational review, helping them to understand the potential that data brings to solve their most pressing problems

·      You will partner with teammates to create complex data processing pipelines in order to solve our clients' most ambitious challenges

·      You will collaborate with Data Scientists in order to design scalable implementations of their models

·      You will pair to write clean and iterative code based on TDD

·      Leverage various continuous delivery practices to deploy, support and operate data pipelines

·      Advise and educate clients on how to use different distributed storage and computing technologies from the plethora of options available

·      Develop and operate modern data architecture approaches to meet key business objectives and provide end-to-end data solutions

·      Create data models and speak to the tradeoffs of different modeling approaches

·      On other projects, you might be acting as the architect, leading the design of technical solutions, or perhaps overseeing a program inception to build a new product

·      Seamlessly incorporate data quality into your day-to-day work as well as into the delivery process

·      Assure effective collaboration between Thoughtworks' and the client's teams, encouraging open communication and advocating for shared outcomes
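The TDD pairing mentioned above ("write clean and iterative code based on TDD") can be sketched as follows. The enrichment step, field names, and lookup table are invented for illustration; the point is the shape of the practice, where the test drives a minimal pipeline function.

```python
# A sketch of test-driven development on a pipeline step: the test below
# was written first, then enrich_events was made just good enough to pass.

def enrich_events(events: list[dict], country_lookup: dict[str, str]) -> list[dict]:
    """Pipeline step: attach a country name to each event, defaulting to 'unknown'."""
    return [
        {**event, "country": country_lookup.get(event["country_code"], "unknown")}
        for event in events
    ]

def test_enrich_events():
    events = [{"id": 1, "country_code": "IN"}, {"id": 2, "country_code": "ZZ"}]
    result = enrich_events(events, {"IN": "India"})
    assert result[0]["country"] == "India"
    assert result[1]["country"] == "unknown"

test_enrich_events()
```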

 

Job qualifications

Technical skills

·      You are equally happy coding and leading a team to implement a solution

·      You have a track record of innovation and expertise in Data Engineering

·      You're passionate about craftsmanship and have applied your expertise across a range of industries and organizations

·      You have a deep understanding of data modelling and experience with data engineering tools and platforms such as Kafka, Spark, and Hadoop

·      You have built large-scale data pipelines and data-centric applications using any of the distributed storage platforms such as HDFS, S3, NoSQL databases (HBase, Cassandra, etc.) and any of the distributed processing platforms like Hadoop, Spark, Hive, Oozie, and Airflow in a production setting

·      Hands-on experience with MapR, Cloudera, Hortonworks and/or cloud-based Hadoop distributions (AWS EMR, Azure HDInsight, Qubole, etc.)

·      You are comfortable taking data-driven approaches and applying data security strategy to solve business problems

·      You're genuinely excited about data infrastructure and operations, with familiarity working in cloud environments

·      Working with data excites you: you have created Big data architecture, you can build and operate data pipelines, and maintain data storage, all within distributed systems

 

Professional skills


·      Advocate your data engineering expertise to the broader tech community outside of Thoughtworks, speaking at conferences and acting as a mentor for more junior-level data engineers

·      You're resilient and flexible in ambiguous situations and enjoy solving problems from technical and business perspectives

·      An interest in coaching others, sharing your experience and knowledge with teammates

·      You enjoy influencing others and always advocate for technical excellence while being open to change when needed

Mobile Programming LLC
Posted by Sukhdeep Singh
Bengaluru (Bangalore)
4 - 6 yrs
₹10L - ₹15L / yr
ETL
Informatica
Data Warehouse (DWH)
Snowflake schema
Snowflake

Job Title: AWS-Azure Data Engineer with Snowflake

Location: Bangalore, India

Experience: 4+ years

Budget: 15 to 20 LPA

Notice Period: Immediate joiners or less than 15 days

Job Description:

We are seeking an experienced AWS-Azure Data Engineer with expertise in Snowflake to join our team in Bangalore. As a Data Engineer, you will be responsible for designing, implementing, and maintaining data infrastructure and systems using AWS, Azure, and Snowflake. Your primary focus will be on developing scalable and efficient data pipelines, optimizing data storage and processing, and ensuring the availability and reliability of data for analysis and reporting.

Responsibilities:

  1. Design, develop, and maintain data pipelines on AWS and Azure to ingest, process, and transform data from various sources.
  2. Optimize data storage and processing using cloud-native services and technologies such as AWS S3, AWS Glue, Azure Data Lake Storage, Azure Data Factory, etc.
  3. Implement and manage data warehouse solutions using Snowflake, including schema design, query optimization, and performance tuning.
  4. Collaborate with cross-functional teams to understand data requirements and translate them into scalable and efficient technical solutions.
  5. Ensure data quality and integrity by implementing data validation, cleansing, and transformation processes.
  6. Develop and maintain ETL processes for data integration and migration between different data sources and platforms.
  7. Implement and enforce data governance and security practices, including access control, encryption, and compliance with regulations.
  8. Collaborate with data scientists and analysts to support their data needs and enable advanced analytics and machine learning initiatives.
  9. Monitor and troubleshoot data pipelines and systems to identify and resolve performance issues or data inconsistencies.
  10. Stay updated with the latest advancements in cloud technologies, data engineering best practices, and emerging trends in the industry.
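Responsibility 5 above (data validation, cleansing, and transformation) might look like the following minimal sketch. The rules and field names are assumptions for illustration, not part of the actual role; rejected rows are split out in the way a quarantine table would receive them.

```python
# Hypothetical validation/cleansing step: each rule returns a violation
# message, and records with any violation are routed to a reject list.

def validate_record(record: dict) -> list[str]:
    """Return a list of data-quality violations for one record."""
    errors = []
    if not record.get("id"):
        errors.append("missing id")
    amount = record.get("amount")
    if amount is None or amount < 0:
        errors.append("invalid amount")
    return errors

def cleanse(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split records into clean rows and rejects (for a quarantine table)."""
    clean, rejects = [], []
    for record in records:
        (rejects if validate_record(record) else clean).append(record)
    return clean, rejects

clean, rejects = cleanse([
    {"id": 1, "amount": 10.0},
    {"id": None, "amount": 5.0},
    {"id": 3, "amount": -2.0},
])
```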

Requirements:

  1. Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
  2. Minimum of 4 years of experience as a Data Engineer, with a focus on AWS, Azure, and Snowflake.
  3. Strong proficiency in data modelling, ETL development, and data integration.
  4. Expertise in cloud platforms such as AWS and Azure, including hands-on experience with data storage and processing services.
  5. In-depth knowledge of Snowflake, including schema design, SQL optimization, and performance tuning.
  6. Experience with scripting languages such as Python or Java for data manipulation and automation tasks.
  7. Familiarity with data governance principles and security best practices.
  8. Strong problem-solving skills and ability to work independently in a fast-paced environment.
  9. Excellent communication and interpersonal skills to collaborate effectively with cross-functional teams and stakeholders.
  10. Immediate joiner or notice period less than 15 days preferred.

If you possess the required skills and are passionate about leveraging AWS, Azure, and Snowflake to build scalable data solutions, we invite you to apply. Please submit your resume and a cover letter highlighting your relevant experience and achievements in the AWS, Azure, and Snowflake domains.

Ascendeum
Posted by Sonali Jain
Remote only
1 - 3 yrs
₹6L - ₹9L / yr
Python
CI/CD
Storage & Networking
Data storage
  • Understand long- and short-term business requirements and precisely match them with the capabilities of the different distributed storage and computing technologies available in the ecosystem.

  • Create complex data processing pipelines

  • Design scalable implementations of the models developed by our Data Scientists.

  • Deploy data pipelines to production systems based on CI/CD practices.

  • Create and maintain clear documentation on data models/schemas as well as transformation/validation rules.

  • Troubleshoot and remediate data quality issues raised by pipeline alerts or downstream consumers
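One way to combine the documentation and troubleshooting bullets above is to keep the documented schema and the validation rule in one place, so pipeline alerts always reflect the schema as written down. A hypothetical sketch (field names and types are invented):

```python
# Documented schema as data: the same dict serves as documentation for
# downstream consumers and as the source of truth for validation.
SCHEMA = {
    "event_id": int,
    "url": str,
    "revenue": float,
}

def check_schema(record: dict) -> list[str]:
    """Return pipeline-alert messages for fields that are missing or mistyped."""
    alerts = []
    for field, expected in SCHEMA.items():
        if field not in record:
            alerts.append(f"{field}: missing")
        elif not isinstance(record[field], expected):
            alerts.append(f"{field}: expected {expected.__name__}")
    return alerts

# A malformed record raises two alerts: a mistyped id and a missing field.
alerts = check_schema({"event_id": "7", "url": "https://example.com"})
```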

Celebal Technologies Pvt Ltd
Posted by Anjani Upadhyay
Jaipur
3 - 6 yrs
₹10L - ₹15L / yr
ETL
Informatica
Data Warehouse (DWH)
SQL
NOSQL Databases

Job Description: 

An Azure Data Engineer is responsible for designing, implementing, and maintaining pipelines and ETL/ELT flow solutions on the Azure cloud platform. This role requires a strong understanding of database migration technologies and the ability to deploy and manage database solutions in the Azure cloud environment.

 

Key Skills:

·      Min. 3 years of experience with data modeling, data warehousing, and building ETL pipelines.

·      Must have firm knowledge of SQL, NoSQL, SSIS, SSRS, and ETL/ELT concepts.

·      Should have hands-on experience in Databricks, ADF (Azure Data Factory), ADLS, Cosmos DB.

·      Excel in the design, creation, and management of very large datasets

·      Detailed knowledge of cloud-based data warehouses, architecture, infrastructure components, ETL, and reporting analytics tools and environments.

·      Skilled with writing, tuning, and troubleshooting SQL queries

·      Experience with Big Data technologies such as Data storage, Data mining, Data analytics, and Data visualization.

·      Should be familiar with programming and able to write and debug code in any of Node.js, Python, C#, .NET, or Java.

 

Technical Expertise and Familiarity:

  • Cloud Technologies: Azure (ADF, ADB, Logic Apps, Azure SQL database, Azure Key Vaults, ADLS, Synapse)
  • Database: CosmosDB, Document DB  
  • IDEs: Visual Studio, VS Code, MS SQL Server
  • Data Modelling, ELT, ETL methodology
Tredence
Posted by Suchismita Das
Bengaluru (Bangalore), Gurugram, Chennai, Pune
8 - 10 yrs
Best in industry
Machine Learning (ML)
Data Science
Natural Language Processing (NLP)
R Programming
SQL

THE IDEAL CANDIDATE WILL

 

  • Engage with executive-level stakeholders from the client's team to translate business problems into high-level solution approaches
  • Partner closely with practice and technical teams to craft well-structured, comprehensive proposals and RFP responses, clearly highlighting Tredence's competitive strengths relevant to the client's selection criteria
  • Actively explore the client's business and formulate solution ideas that can improve process efficiency, cut cost, or achieve growth/revenue/profitability targets faster
  • Work hands-on across various MLOps problems and provide thought leadership
  • Grow and manage large teams with diverse skillsets
  • Collaborate, coach, and learn with a growing team of experienced Machine Learning Engineers and Data Scientists

 

 

 

ELIGIBILITY CRITERIA

 

  • BE/BTech/MTech (specialization/courses in ML/DS)
  • At least 7 years of consulting services delivery experience
  • Very strong problem-solving skills and work ethic
  • Strong analytical/logical thinking, storyboarding, and executive communication skills
  • 5+ years of experience in Python/R and SQL
  • 5+ years of experience in NLP algorithms, regression and classification modelling, and time-series forecasting
  • Hands-on work experience in DevOps
  • Good knowledge of deployment types such as PaaS, SaaS, and IaaS
  • Exposure to cloud technologies such as Azure, AWS, or GCP
  • Knowledge of Python and its data-analysis packages (scikit-learn, SciPy, NumPy, pandas, Matplotlib)
  • Knowledge of deep learning frameworks: Keras, TensorFlow, PyTorch, etc.
  • Experience with one or more container ecosystems (Docker, Kubernetes)
  • Experience in building orchestration pipelines to convert plain Python models into deployable APIs/RESTful endpoints
  • Good understanding of OOP and data structures concepts
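The criterion above about converting plain Python models into deployable API/RESTful endpoints can be illustrated with a minimal, framework-free sketch. The model, its weights, and the payload shape are all invented; a real deployment would wrap the same handler behind Flask, FastAPI, or a similar server.

```python
import json

def model_predict(features: list[float]) -> float:
    """Stand-in for a trained model: a fixed linear scorer with invented weights."""
    weights = [0.5, -0.25]
    return sum(w * x for w, x in zip(weights, features))

def handle_request(body: str) -> str:
    """Parse a JSON request body, run the model, and return a JSON response.

    This is the piece an HTTP framework would call for each POST request.
    """
    payload = json.loads(body)
    score = model_predict(payload["features"])
    return json.dumps({"score": score})

response = handle_request('{"features": [2.0, 4.0]}')
```

Keeping the handler a pure function of the request body makes it trivially unit-testable before any server plumbing is added.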

 

 

Nice to Have:

 

  • Exposure to deployment strategies like: Blue/Green, Canary, AB Testing, Multi-arm Bandit
  • Experience in Helm is a plus
  • Strong understanding of data infrastructure, data warehouse, or data engineering

 

You can expect to –

  • Work with the world's biggest retailers and help them solve some of their most critical problems. Tredence is a preferred analytics vendor for some of the largest retailers across the globe
  • Create multi-million dollar business opportunities by leveraging an impact mindset, cutting-edge solutions, and industry best practices
  • Work in a diverse environment that keeps evolving
  • Hone your entrepreneurial skills as you contribute to the growth of the organization

 

 

Orboai
Posted by Hardika Bhansali
Noida, Mumbai
1 - 3 yrs
₹6L - ₹15L / yr
TensorFlow
OpenCV
OCR
PyTorch
Keras

Who Are We

 

Orbo is a research-oriented company with expertise in computer vision and artificial intelligence. At its core is a comprehensive platform of AI-based visual-enhancement tools, so companies can find a product suited to their needs, where deep-learning-powered technology automatically improves their imagery.

 

ORBO's solutions are helping the BFSI, beauty and personal care (digital transformation), and e-commerce (image retouching) industries in multiple ways.

 

WHY US

  • Join top AI company
  • Grow with your best companions
  • Continuous pursuit of excellence, equality, respect
  • Competitive compensation and benefits

You'll be a part of the core team and will be working directly with the founders in building and iterating upon the core products that make cameras intelligent and images more informative.

 

To learn more about how we work, please check out

https://www.orbo.ai/.

 

Description:

We are looking for a computer vision engineer to lead our team in developing a factory-floor analytics SaaS product. This is a fast-paced role, and the person will get an opportunity to develop an industrial-grade solution from concept to deployment.

 

Responsibilities:

  • Research and develop computer vision solutions for industries (BFSI, Beauty and personal care, E-commerce, Defence etc.)
  • Lead a team of ML engineers in developing an industrial AI product from scratch
  • Setup end-end Deep Learning pipeline for data ingestion, preparation, model training, validation and deployment
  • Tune the models to achieve high accuracy rates and minimum latency
  • Deploy developed computer vision models on edge devices, after optimization, to meet customer requirements

 

 

Requirements:

  • Bachelor's degree
  • Understanding of the depth and breadth of computer vision and deep learning algorithms
  • Experience taking an AI product from scratch to commercial deployment
  • Experience with image enhancement, object detection, image segmentation, and image classification algorithms
  • Experience in deployment with OpenVINO, ONNX Runtime, and TensorRT
  • Experience deploying computer vision solutions on edge devices such as Intel Movidius and Nvidia Jetson
  • Experience with machine/deep learning frameworks such as TensorFlow and PyTorch
  • Proficient understanding of code versioning tools, such as Git
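The model-validation step in the pipeline responsibilities above can be illustrated by computing standard metrics from scratch. The labels below are invented, and real evaluation would run over a held-out image set with framework tooling, but the arithmetic is the same.

```python
# Minimal evaluation sketch for a binary detector: accuracy over all
# predictions, precision over the positive predictions.

def accuracy(y_true: list[int], y_pred: list[int]) -> float:
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)

def precision(y_true: list[int], y_pred: list[int]) -> float:
    true_pos = sum(t == p == 1 for t, p in zip(y_true, y_pred))
    pred_pos = sum(p == 1 for p in y_pred)
    return true_pos / pred_pos if pred_pos else 0.0

# Invented validation labels and model outputs.
y_true = [1, 0, 1, 1, 0]
y_pred = [1, 0, 0, 1, 1]
acc = accuracy(y_true, y_pred)    # 3 of 5 predictions correct
prec = precision(y_true, y_pred)  # 2 true positives of 3 predicted positives
```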

Our perfect candidate is someone that:

  • is proactive and an independent problem solver
  • is a constant learner. We are a fast growing start-up. We want you to grow with us!
  • is a team player and good communicator

 

What We Offer:

  • You will have fun working with a fast-paced team on a product that can impact the business model of E-commerce and BFSI industries. As the team is small, you will easily be able to see a direct impact of what you build on our customers (Trust us - it is extremely fulfilling!)
  • You will be in charge of what you build and be an integral part of the product development process
  • Technical and financial growth!
Mirafra Technologies
Posted by Nirmala N S
Remote, Bengaluru (Bangalore)
4 - 7 yrs
₹5L - ₹18L / yr
Big Data
Scala
Spark
Spark Streaming
Hadoop
Should have experience in Big data development
Strong experience in Scala/Spark

End client: Sapient
Mode of hiring: FTE
Notice period should be less than 30 days
Market-leading fintech company dedicated to providing credit
Agency job
via Talent Socio Bizcon LLP by Hema Latha N
Noida, NCR (Delhi | Gurgaon | Noida)
1 - 4 yrs
₹8L - ₹18L / yr
Analytics
Predictive analytics
Linear regression
Logistic regression
Python
Job Description

Role: Analytics Scientist - Risk Analytics
Experience Range: 1 to 4 years
Job Location: Noida

Key responsibilities include:

  • Building models to predict risk and other key metrics
  • Coming up with data-driven solutions to control risk
  • Finding opportunities to acquire more customers by modifying/optimizing existing rules
  • Doing periodic upgrades of the underwriting strategy based on business requirements
  • Evaluating 3rd-party solutions for predicting/controlling risk of the portfolio
  • Running periodic controlled tests to optimize underwriting
  • Monitoring key portfolio metrics and taking data-driven actions based on performance

Business knowledge: Develop an understanding of the domain/function. Manage business processes in the work area. The individual is expected to develop domain expertise in his/her work area.

Teamwork: Develop cross-site relationships to enhance leverage of ideas. Set and manage partner expectations. Drive implementation of projects with the engineering team while partnering seamlessly with cross-site team members.

Communication: Responsibly perform end-to-end project communication across the various levels of the organization.

Candidate specification - skills:

  • Knowledge of an analytical tool: R or Python
  • Established competency in predictive analytics (logistic and regression modelling)
  • Experience in handling complex data sources
  • Dexterity with MySQL and MS Excel is good to have
  • Strong analytical aptitude and logical reasoning ability
  • Strong presentation and communication skills

Preferred:

  • 1-3 years of experience in the financial services/analytics industry
  • Understanding of the financial services business
  • Experience in working on advanced machine learning techniques

If interested, please send your updated profile in Word format with the below details for further discussion at the earliest:

  1. Current company
  2. Current designation
  3. Total experience
  4. Current CTC (fixed and variable)
  5. Expected CTC
  6. Notice period
  7. Current location
  8. Reason for change
  9. Availability for a face-to-face interview on weekdays
  10. Education degree

Thanks & Regards,
Hema, Talent Socio
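The logistic regression competency named in the skills above can be illustrated with a from-scratch sketch trained by gradient descent on a tiny invented risk dataset; production work would use scikit-learn, statsmodels, or R rather than hand-rolled training.

```python
import math

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs: list[float], ys: list[int], lr: float = 0.1, epochs: int = 2000):
    """Fit w, b for P(default=1 | x) = sigmoid(w*x + b) by stochastic gradient descent."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = sigmoid(w * x + b)
            w -= lr * (p - y) * x  # gradient of log-loss w.r.t. w
            b -= lr * (p - y)      # gradient of log-loss w.r.t. b
    return w, b

# Invented data: x = debt-to-income ratio, y = 1 if the loan defaulted.
xs = [0.1, 0.2, 0.3, 0.7, 0.8, 0.9]
ys = [0, 0, 0, 1, 1, 1]
w, b = fit_logistic(xs, ys)

# Score two hypothetical applicants: low-risk and high-risk profiles.
risk_low = sigmoid(w * 0.15 + b)
risk_high = sigmoid(w * 0.85 + b)
```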