Data Engineer

at Healthifyme

Posted by Jaya Harjai
Bengaluru (Bangalore)
3 - 4 yrs
₹18L - ₹35L / yr
Full time
Skills
Python
SQL
Data engineering
Big Data
Data Warehouse (DWH)
ETL
Apache Spark
Apache Flink

Responsibilities:

  • Design, construct, install, test and maintain data pipeline and data management systems.
  • Ensure that all systems meet the business/company requirements as well as industry practices.
  • Integrate up-and-coming data management and software engineering technologies into existing data structures.
  • Develop processes for data mining, data modeling, and data production.
  • Create custom software components and analytics applications.
  • Collaborate with members of your team (e.g., Data Architects, the Software team, Data Scientists) on the project's goals.
  • Recommend different ways to constantly improve data reliability and quality.

 

Requirements:

  • Proven experience in a related field, with demonstrable real-world skills and references from former employers.
  • Familiarity with data warehouses such as Redshift, BigQuery, and Athena.
  • Familiarity with data processing systems such as Flink, Spark, and Storm.
  • Proficiency in Python and SQL, backed by relevant work experience and proof of technical expertise.
  • You may also consider a Master's degree in computer engineering or computer science to fine-tune your skills while on the job (a Master's isn't required, but it is always appreciated).
  • Intellectual curiosity to find new and unusual ways to solve data management issues.
  • Ability to approach data organization challenges while keeping an eye on what's important.
  • Basic data science knowledge is a must; you should understand the fundamentals of analytics.
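Proficiency in Python and SQL in a role like this usually means being comfortable moving data between the two. A minimal, hypothetical sketch using Python's built-in sqlite3 module (the table and column names are purely illustrative, not Healthifyme's schema):

```python
import sqlite3

# Illustrative only: a tiny extract-transform-load round trip in pure Python.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE food_logs (user_id INTEGER, calories REAL)")

# Extract: raw rows as they might arrive from an upstream source.
raw_rows = [(1, 520.0), (1, 430.0), (2, 610.0), (2, None)]

# Transform: drop incomplete records before loading.
clean_rows = [r for r in raw_rows if r[1] is not None]
conn.executemany("INSERT INTO food_logs VALUES (?, ?)", clean_rows)

# Load/verify: aggregate with SQL rather than in application code.
totals = dict(conn.execute(
    "SELECT user_id, SUM(calories) FROM food_logs GROUP BY user_id"
))
print(totals)  # {1: 950.0, 2: 610.0}
conn.close()
```

The point of the sketch is the division of labour: Python handles ingestion and cleaning, while aggregation is pushed down to SQL.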

About Healthifyme

About Us


We were founded in 2012 by Tushar Vashisht and Sachin Shenoy, and incubated by Microsoft Accelerator.
Today, we happen to be India's largest and most loved health & fitness app with over 4 million users from
220+ cities in India. What makes us unique is our ability to bring together the power of artificial
intelligence powered technology and human empathy to deliver measurable impact in our customers'
lives. We do this through our team of elite nutritionists & trainers working together with the world's first
AI-powered virtual nutritionist, "Ria", our proudest creation to date. Ria references data from over 200
million food & workout logs and 14 million conversations to deliver intelligent health & fitness suggestions
to our customers. Ria is also multilingual: "she" understands English, French, German, Italian,
and Hindi.

 

Recently, Russia's Sistema and Samsung's AI-focused fund NEXT led a USD 12 million Series B funding
round into our business. We are the most liked app in India across categories: we've been consistently rated
the No. 1 health & fitness app on the Play Store for 3 years running and received Google's "Editors' Choice"
award in 2017. Some of the marquee corporates in the country, such as Cognizant, Accenture, Deloitte, and
MetLife, have also benefited from our employee engagement and wellness programs. Our global aspirations
have taken us to the MENA, SEA, and LATAM regions, with more markets to follow.

Company website www.healthifyme.com

Founded
2012
Type
Products & Services
Size
100-1000 employees
Stage
Raised funding

Similar jobs

Data Engineer

at Information Solution Provider Company

Agency job
via Jobdost
Spark
Scala
Hadoop
PySpark
Data engineering
Big Data
Machine Learning (ML)
Delhi
3 - 5 yrs
₹3L - ₹10L / yr

Data Engineer 

Responsibilities:

 

  • Designing and implementing fine-tuned production ready data/ML pipelines in Hadoop platform.
  • Driving optimization, testing and tooling to improve quality.
  • Reviewing and approving high-level & detailed design to ensure that the solution delivers to the business needs and aligns to the data & analytics architecture principles and roadmap.
  • Understanding business requirements and solution design to develop and implement solutions that adhere to big data architectural guidelines and address business requirements.
  • Following proper SDLC (Code review, sprint process).
  • Identifying, designing, and implementing internal process improvements: automating manual processes, optimizing data delivery, etc.
  • Building robust and scalable data infrastructure (both batch processing and real-time) to support needs from internal and external users.
  • Understanding various data security standards and using secure data security tools to apply and adhere to the required data controls for user access in the Hadoop platform.
  • Supporting and contributing to development guidelines and standards for data ingestion.
  • Working with data scientists and the business analytics team to assist with data ingestion and data-related technical issues.
  • Designing and documenting the development & deployment flow.

 

Requirements:

 

  • Experience in developing REST API services using one of the Scala frameworks.
  • Ability to troubleshoot and optimize complex queries on the Spark platform
  • Expert in building and optimizing ‘big data’ data/ML pipelines, architectures and data sets.
  • Knowledge of modelling unstructured data into structured data designs.
  • Experience in Big Data access and storage techniques.
  • Experience in doing cost estimation based on the design and development.
  • Excellent debugging skills for the technical stack mentioned above which even includes analyzing server logs and application logs.
  • Highly organized, self-motivated, proactive, and ability to propose best design solutions.
  • Good time management and multitasking skills to work to deadlines by working independently and as a part of a team.

 

Job posted by
Saida Jabbar

Python Developer

at Testbook

Founded 2013  •  Services  •  100-1000 employees  •  Raised funding
Python
pandas
Big Data
Google Cloud Platform (GCP)
Amazon Web Services (AWS)
Navi Mumbai
1 - 2 yrs
₹1L - ₹5L / yr

About Us:-

The fastest-rising startup in the EdTech space, focused on engineering and government job exams and with an eye to capture UPSC, PSC, and international exams, Testbook is poised to revolutionize the industry. With a registered user base of over 2.2 crore students, more than 450 crore questions solved on the WebApp, and a knockout Android app, Testbook has raced to the front and is ideally placed to capture bigger markets.

Testbook is the perfect incubator for talent. You come, you learn, you conquer. You train under the best mentors and become an expert in your field in your own right. That being said, you have flexibility in the projects you choose, in how and when you work on them, and in what you want to add to them. You are the sole master of your work.

The IIT pedigree of the co-founders has attracted some of the brightest minds in the country to Testbook. A team that is quickly swelling in ranks, it now stands at 500+ in-house employees and hundreds of remote interns and freelancers. And the number is rocketing weekly. Now is the time to join the force.



In this role you will get to:-

  • Work with state-of-the-art data frameworks and technologies like Dataflow (Apache Beam), Dataproc (Apache Spark & Hadoop), Apache Kafka, Google Pub/Sub, Apache Airflow, and others.
  • You will work cross-functionally with various teams, creating solutions that deal with large volumes of data.
  • You will work with the team to set and maintain standards and development practices.
  • You will be a keen advocate of quality and continuous improvement.
  • You will modernize the current data systems to develop Cloud-enabled Data and Analytics solutions
  • Drive the development of cloud-based data lake, hybrid data warehouses & business intelligence platforms
  • Improve upon the data ingestion models, ETL jobs, and alerts to maintain data integrity and data availability
  • Build Data Pipelines to ingest structured and Unstructured Data.
  • Gain hands-on experience with new data platforms and programming languages
  • Analyze and provide data-supported recommendations to improve product performance and customer acquisition
  • Design, Build and Support resilient production-grade applications and web services

 

Who you are:-

  • 1+ years of work experience in Software Engineering and development.
  • Very strong understanding of Python & the pandas library. Good understanding of Scala, R, and other related languages.
  • Experience with data transformation & data analytics in both batch & streaming mode using cloud-native technologies.
  • Strong experience with the big data technologies like Hadoop, Spark, BigQuery, DataProc, Dataflow
  • Strong analytical and communication skills.
  • Experience working with large, disconnected, and/or unstructured datasets.
  • Experience building and optimizing data pipelines, architectures, and data sets using cloud-native technologies.
  • Hands-on experience with any cloud tech like GCP/AWS is a plus.
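The batch vs. streaming distinction called out above comes down to whether you aggregate a materialized data set all at once or maintain running state as events arrive. A hypothetical, dependency-free sketch (the record shapes are invented for illustration; real pipelines would use pandas, Beam, or Spark):

```python
import json
from collections import defaultdict

# Hypothetical events, as they might arrive from a queue or log file.
events = [
    '{"exam": "SSC", "solved": 12}',
    '{"exam": "UPSC", "solved": 7}',
    '{"exam": "SSC", "solved": 5}',
]

# Batch mode: materialize everything, then aggregate once.
batch = [json.loads(e) for e in events]
batch_totals = defaultdict(int)
for rec in batch:
    batch_totals[rec["exam"]] += rec["solved"]

# Streaming mode: update running totals one event at a time.
stream_totals = defaultdict(int)
for e in events:
    rec = json.loads(e)
    stream_totals[rec["exam"]] += rec["solved"]

assert batch_totals == stream_totals  # same result, different cost profile
print(dict(stream_totals))  # {'SSC': 17, 'UPSC': 7}
```

Both paths produce identical totals; the difference is memory footprint and latency, which is why cloud-native tools expose both modes.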
Job posted by
Kush Semwal

GCP Developer

at Quess Corp Limited

Founded 2007  •  Products & Services  •  5000+ employees  •  Profitable
Google Cloud Platform (GCP)
Python
Big Data
Data processing
Data Visualization
Noida, Delhi, Gurugram, Ghaziabad, Faridabad, Bengaluru (Bangalore), Chennai
5 - 8 yrs
₹1L - ₹15L / yr

A GCP Data Analyst profile must have the below skill sets:

 

Job posted by
Anjali Singh

SQL Developer

at Fragma Data Systems

Founded 2015  •  Products & Services  •  Profitable
Data Warehouse (DWH)
Informatica
ETL
SQL
SSIS
Remote only
5 - 7 yrs
₹10L - ₹18L / yr
SQL Developer with 7 years of relevant experience and strong communication skills.
 
Key responsibilities:
 
 
  • Creating, designing and developing data models
  • Prepare plans for all ETL (Extract/Transformation/Load) procedures and architectures
  • Validating results and creating business reports
  • Monitoring and tuning data loads and queries
  • Develop and prepare a schedule for a new data warehouse
  • Analyze large databases and recommend appropriate optimization for the same
  • Administer all requirements and design various functional specifications for data
  • Provide support to the Software Development Life cycle
  • Prepare various code designs and ensure efficient implementation of the same
  • Evaluate all codes and ensure the quality of all project deliverables
  • Monitor data warehouse work and provide subject matter expertise
  • Hands-on BI practices, data structures, data modeling, SQL skills
 
 

Experience
Experience Range

5 Years - 10 Years

Function Information Technology
Desired Skills
Must have Skills:  SQL

Hard Skills for a Data Warehouse Developer:
 
  • Hands-on experience with ETL tools e.g., DataStage, Informatica, Pentaho, Talend
  • Sound knowledge of SQL
  • Experience with SQL databases such as Oracle, DB2, and SQL Server
  • Experience using Data Warehouse platforms e.g., SAP, Birst
  • Experience designing, developing, and implementing Data Warehouse solutions
  • Project management and system development methodology
  • Ability to proactively research solutions and best practice
 
Soft Skills for Data Warehouse Developers:
 
  • Excellent Analytical skills
  • Excellent verbal and written communications
  • Strong organization skills
  • Ability to work on a team, as well as independently
Job posted by
Sandhya JD

Data Engineer For Python

at A2Tech Consultants

Data engineering
Data Engineer
ETL
Spark
Apache Kafka
Big Data
Python
PySpark
Object Oriented Programming (OOPs)
Elastic Search
Pune
4 - 12 yrs
₹6L - ₹15L / yr
We are looking for a smart candidate with:
  • Strong Python Coding skills and OOP skills
  • Should have worked on Big Data product Architecture
  • Should have worked with any one of the SQL-based databases like MySQL or PostgreSQL, and any one of the NoSQL-based databases such as Cassandra, Elasticsearch, etc.
  • Hands-on experience with frameworks like Spark RDD, DataFrame, and Dataset
  • Experience in developing ETL for data products
  • Working knowledge of performance optimization, optimal resource utilization, parallelism, and tuning of Spark jobs
  • Working knowledge of file formats: CSV, JSON, XML, Parquet, ORC, Avro
  • Good to have: working knowledge of any one of the analytical databases like Druid, MongoDB, Apache Hive, etc.
  • Experience handling real-time data feeds (working knowledge of Apache Kafka or a similar tool is good to have)
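The file-format requirement above largely means converting records losslessly between representations. Parquet, ORC, and Avro need third-party libraries, so here is a hedged, stdlib-only sketch of the CSV-to-JSON case (payload and field names are invented for illustration):

```python
import csv
import io
import json

# Illustrative CSV payload, as it might land from an upstream export.
csv_text = "user_id,score\n1,88\n2,93\n"

# Parse CSV into typed records (CSV gives only strings, so cast explicitly).
rows = [
    {"user_id": int(r["user_id"]), "score": int(r["score"])}
    for r in csv.DictReader(io.StringIO(csv_text))
]

# Re-serialize as newline-delimited JSON, a common pipeline interchange format.
ndjson = "\n".join(json.dumps(r) for r in rows)
print(ndjson)
```

The explicit casting step is where columnar formats like Parquet earn their keep: they carry a schema, so types survive the round trip without this manual work.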
Key Skills:
  • Python and Scala (Optional), Spark / PySpark, Parallel programming
Job posted by
Dhaval B

Data Engineer

at Service based company

Big Data
Apache Kafka
Data engineering
Cassandra
Java
Scala
Pune
6 - 12 yrs
₹6L - ₹28L / yr

Primary responsibilities:

  • Architect, Design and Build high performance Search systems for personalization, optimization, and targeting
  • Designing systems with Solr, Akka, Cassandra, Kafka
  • Algorithmic development with a primary focus on Machine Learning
  • Working with rapid and innovative development methodologies like Kanban, Continuous Integration, and daily deployments
  • Participation in design and code reviews and recommend improvements
  • Unit testing with JUnit, Performance testing and tuning
  • Coordination with internal and external teams
  • Mentoring junior engineers
  • Participate in Product roadmap and Prioritization discussions and decisions
  • Evangelize the solution with Professional services and Customer Success teams

 

Job posted by
Rohini Shinde

Principal Engineer - Java+Scala+AWS

at Company is into Product Development.

Scala
Big Data
Java
Amazon Web Services (AWS)
ETL
Remote, Mumbai
10 - 18 yrs
₹30L - ₹55L / yr

What's the role?

Your role as a Principal Engineer will involve working with various teams. As a principal engineer, you will need full knowledge of the software development lifecycle and Agile methodologies. You will demonstrate multi-tasking skills under tight deadlines and constraints. You will regularly contribute to the development of work products (including analyzing, designing, programming, debugging, and documenting software) and may work with customers to resolve challenges and respond to suggestions for improvements and enhancements. You will set the standards and principles for the product you drive.

  • Set up coding practices, guidelines, and quality standards for the software delivered.
  • Determines operational feasibility by evaluating analysis, problem definition, requirements, solution development, and proposed solutions.
  • Documents and demonstrates solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments and clear code.
  • Prepares and installs solutions by determining and designing system specifications, standards, and programming.
  • Improves operations by conducting systems analysis; recommending changes in policies and procedures.
  • Updates job knowledge by studying state-of-the-art development tools, programming techniques, and computing equipment; participating in educational opportunities; reading professional publications; maintaining personal networks; participating in professional organizations.
  • Protects operations by keeping information confidential.
  • Develops software solutions by studying information needs; conferring with users; studying systems flow, data usage, and work processes; investigating problem areas; following the software development lifecycle.

Who are you? You are a go-getter with an eye for detail, strong problem-solving and debugging skills, and a BE/MCA/ME/M.Tech or equivalent degree from a reputed college/university.

 

Essential Skills / Experience:

  • 10+ years of engineering experience
  • Experience in designing and developing high volume web-services using API protocols and data formats
  • Proficient in API modelling languages and annotation
  • Proficient in Java programming
  • Experience with Scala programming
  • Experience with ETL systems
  • Experience with Agile methodologies
  • Experience with Cloud service & storage
  • Proficient in Unix/Linux operating systems
  • Excellent oral and written communication skills

Preferred:
  • Functional programming languages (Scala, etc)
  • Scripting languages (bash, Perl, Python, etc)
  • Amazon Web Services (Redshift, ECS etc)
Job posted by
Dnyanesh Panchal

ETL Engineer - Data Pipeline

at DataToBiz

Founded 2018  •  Services  •  20-100 employees  •  Bootstrapped
ETL
Amazon Web Services (AWS)
Amazon Redshift
Python
Chandigarh, NCR (Delhi | Gurgaon | Noida)
2 - 6 yrs
₹7L - ₹15L / yr
Job Responsibilities:

  • Developing new data pipelines and ETL jobs for processing millions of records; these should be scalable with growth.
  • Optimising pipelines to handle real-time data, batch update data, and historical data.
  • Establishing scalable, efficient, automated processes for complex, large-scale data analysis.
  • Writing high-quality code to gather and manage large data sets (both real-time and batch data) from multiple sources, perform ETL, and store the data in a data warehouse.
  • Manipulating and analysing complex, high-volume, high-dimensional data from varying sources using a variety of tools and data analysis techniques.
  • Participating in data pipeline health monitoring and performance optimisation, as well as quality documentation.
  • Interacting with end users/clients and translating business language into technical requirements.
  • Acting independently to expose and resolve problems.
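A pipeline that must serve both real-time and batch data typically routes both through one shared transformation, so the two paths cannot drift apart. A hypothetical, plain-Python sketch of that pattern (function and field names are illustrative, not a specific tool's API):

```python
from typing import Iterable, Iterator

def transform(record: dict) -> dict:
    # Shared business logic, used identically by the batch and real-time paths.
    return {**record, "amount": round(record["amount"], 2)}

def run_batch(records: Iterable[dict]) -> list:
    # Historical/batch path: process a bounded collection in one pass.
    return [transform(r) for r in records]

def run_stream(records: Iterable[dict], batch_size: int = 2) -> Iterator[list]:
    # Real-time path: emit small micro-batches as data arrives.
    buf = []
    for r in records:
        buf.append(transform(r))
        if len(buf) == batch_size:
            yield buf
            buf = []
    if buf:
        yield buf  # flush the final partial micro-batch

data = [{"id": i, "amount": 10.005 * i} for i in range(1, 4)]
print(run_batch(data))
print(list(run_stream(iter(data))))
```

Frameworks like Spark Structured Streaming and Apache Beam institutionalize exactly this idea: one transformation definition, executed over either a bounded or an unbounded source.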

Job Requirements:

  • 2+ years of experience in software development & data pipeline development for enterprise analytics.
  • 2+ years of working with Python, with exposure to various warehousing tools.
  • In-depth work with any of the commercial tools like AWS Glue, Talend, Informatica, DataStage, etc.
  • Experience with various relational databases like MySQL, MS SQL, Oracle, etc. is a must.
  • Experience with analytics and reporting tools (Tableau, Power BI, SSRS, SSAS).
  • Experience in various DevOps practices, helping the client deploy and scale systems as per requirements.
  • Strong verbal and written communication skills with other developers and business clients.
  • Knowledge of the Logistics and/or Transportation domain is a plus.
  • Hands-on experience with traditional databases and ERP systems like Sybase and PeopleSoft.
Job posted by
PS Dhillon

Principal Architect – Big Data Security Architecture

at IGT Solutions

Founded 1998  •  Products & Services  •  100-1000 employees  •  Profitable
Big Data
Security architecture
Cyber Security
Dubai, Anywhere
12 - 18 yrs
₹50L - ₹70L / yr
  • Design secure solutions in line with the business strategy and security requirements.
  • Contribute to the enterprise security architecture by developing strategies, reference architectures, roadmaps, architectural principles, technology standards, security non-functional requirements, architectural decisions, and design patterns.
  • Deliver cyber security architectural artifacts such as High Level Designs and Solution Blueprints.
  • Ensure the enforcement of security requirements in solution architecture.
  • Contribute to educating other architects and engineering teams in designing and implementing secure solutions.

Technologies: the candidate should have knowledge and experience in designing and implementing the following technologies and related domains:

  • Cloud security
  • Identity and Access Management
  • Encryption, Masking and Key Management
  • Data Classification, Data Privacy and Data Leakage Prevention
  • Infrastructure security (Network/Servers/Virtualization)
  • Application Security
  • Endpoint Security
  • SIEM and Log Management
  • Forward and Reverse Proxy
  • Big Data Security
  • IoT Security
  • SAP Security (Preferred)

Architecture Skills:

  • Solid experience in developing security solution architecture.
  • Solid experience and knowledge of TOGAF and SABSA or other Enterprise Architecture frameworks.
  • Strong experience in developing architectural artifacts, including reference architectures, roadmaps, architectural principles, technology standards, security non-functional requirements, architectural decisions, and design patterns.
  • Strong experience in documenting existing, transition, and target architectures.

Cyber Security Skills:

  • Solid experience in performing security risk assessments and controls implementation.
  • Strong experience in designing and implementing security controls utilizing the technologies mentioned above.
  • Strong knowledge of the offensive and defensive aspects of cybersecurity, with a solid understanding of attack techniques and abuse cases.
  • Strong knowledge and implementation experience of cyber security standards, frameworks, and regulations such as ISO 27001, NIST CSF, CSA CCM, PCI-DSS, and GDPR.
Job posted by
Apoorva Chauhan

Assistant Manager - Analytics - Product Team

at LatentView Analytics

Founded 2006  •  Products & Services  •  100-1000 employees  •  Profitable
Data Science
Analytics
Data Analytics
Data modeling
Data mining
Big Data
Python
Data Visualization
Natural Language Processing (NLP)
Apache Hadoop
Tableau
D3.js
Chennai
5 - 8 yrs
₹5L - ₹8L / yr
Job Overview:

We are looking for an experienced Data Science professional to join our Product team, lead the data analytics team, and manage the processes and people responsible for accurate data collection, processing, modelling, and analysis. The ideal candidate has a knack for seeing solutions in sprawling data sets and the business mindset to convert insights into strategic opportunities for our clients. The incumbent will work closely with leaders across product, sales, and marketing to support and implement high-quality, data-driven decisions. They will ensure data accuracy and consistent reporting by designing and creating optimal processes and procedures for analytics employees to follow. They will use advanced data modelling, predictive modelling, natural language processing, and analytical techniques to interpret key findings.

Responsibilities for Analytics Manager:

  • Build, develop, and maintain data models, reporting systems, data automation systems, dashboards, and performance metrics that support key business decisions.
  • Design and build technical processes to address business issues.
  • Manage and optimize processes for data intake, validation, mining, and engineering, as well as modelling, visualization, and communication deliverables.
  • Examine, interpret, and report results to stakeholders in leadership, technology, sales, marketing, and product teams.
  • Develop and implement quality controls and standards to ensure quality.
  • Anticipate future demands of initiatives related to people, technology, budget, and business within your department, and design/implement solutions to meet these needs.
  • Communicate the results and business impacts of insight initiatives to stakeholders within and outside of the company.
  • Lead cross-functional projects using advanced data modelling and analysis techniques to discover insights that will guide strategic decisions and uncover optimization opportunities.

Qualifications for Analytics Manager:

  • Working knowledge of data mining principles: predictive analytics, mapping, and collecting data from multiple cloud-based data sources.
  • Strong SQL skills and the ability to perform effective querying.
  • Understanding of and experience using analytical concepts and statistical techniques: hypothesis development, designing tests/experiments, analysing data, drawing conclusions, and developing actionable recommendations for business units.
  • Experience with and knowledge of statistical modelling techniques: GLM multiple regression, logistic regression, log-linear regression, variable selection, etc.
  • Experience working with and creating databases and dashboards, using all relevant data to inform decisions.
  • Strong problem-solving, quantitative, and analytical abilities.
  • Strong ability to plan and manage numerous processes, people, and projects simultaneously.
  • Excellent communication, collaboration, and delegation skills.

We're looking for someone with at least 5 years of experience in a position monitoring, managing, and drawing insights from data, and at least 3 years of experience leading a team. The right candidate will also be proficient and experienced with the following tools/programs:

  • Strong programming skills with querying languages: R, Python, etc.
  • Experience with big data tools like Hadoop.
  • Experience with data visualization tools: Tableau, d3.js, etc.
  • Experience with Excel, Word, and PowerPoint.
Job posted by
Kannikanti madhuri