Data Engineer

at Recro

Posted by Alok Singh
Remote only
1 - 2 yrs
₹6L - ₹8L / yr
Full time
Skills
SQL
Python
Go Programming (Golang)
PostgreSQL
RabbitMQ
Big Data
Systems design
ETL
RESTful APIs
Tableau
PowerBI
Amazon Web Services (AWS)
Google Cloud Platform (GCP)
Apache Kafka
Data Scraping
Experience: 2 to 5 Years.
Skills: SQL, DBT, Airflow, Python, Golang, PostgreSQL, Clickhouse, BigQuery, Kafka/RabbitMQ, Data Scraping, System Design, ETLs, REST.
Work Location: Gurgaon/Bangalore/Remote.

We are seeking an experienced Analytics Engineer to join our team. The ideal candidate will have a strong background in Python, SQL, dbt, Airflow, and core database concepts. As an Analytics Engineer, you will be responsible for managing our data infrastructure, which includes building and maintaining robust API integrations with third-party data providers, designing and implementing data pipelines that run on schedule, and working with business, product, and engineering teams to deliver high-quality data products.

Key Responsibilities:
- Design, build, and maintain robust API integrations with third-party data providers.
- Develop and maintain data pipelines using Python, dbt, and Airflow.
- Collaborate with business, product, and engineering teams to deliver high-quality data products.
- Monitor and optimize data pipelines to ensure they run on schedule and with high performance.
- Stay up-to-date with the latest developments in data infrastructure and analytics technologies.
- Troubleshoot and resolve data pipeline and integration issues as they arise.
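The first two responsibilities (pulling data from a third-party provider and loading it into a table) can be sketched in plain standard-library Python; the endpoint, table, and field names below are hypothetical, the fetch is stubbed so the sketch runs offline, and in production this logic would typically live inside a scheduled Airflow task:

```python
import json
import sqlite3
from urllib.request import urlopen


def fetch_provider_data(url):
    """Pull a JSON payload from a third-party provider (hypothetical endpoint)."""
    with urlopen(url) as resp:
        return json.loads(resp.read())


def load_records(conn, records):
    """Idempotently upsert records into a staging table so reruns are safe."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS staging_events (id INTEGER PRIMARY KEY, payload TEXT)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO staging_events (id, payload) VALUES (?, ?)",
        [(r["id"], json.dumps(r)) for r in records],
    )
    conn.commit()


conn = sqlite3.connect(":memory:")
# Stubbed payload instead of calling fetch_provider_data, to keep the sketch runnable.
records = [{"id": 1, "value": 10}, {"id": 2, "value": 20}]
load_records(conn, records)
print(conn.execute("SELECT COUNT(*) FROM staging_events").fetchone()[0])  # → 2
```

The idempotent upsert is what makes a pipeline safe to re-run when the scheduler retries a failed task.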

Qualifications:
- Strong experience with SQL, dbt, Airflow, and core database concepts.
- Experience with building and maintaining API integrations with third-party data providers. 
- Experience with designing and implementing data pipelines.
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration skills.
- Experience with big data warehouses like BigQuery, Redshift, or Clickhouse.
- Experience with data visualization and analysis tools such as Metabase, Tableau or Power BI is a plus.
- Familiarity with cloud platforms such as AWS or GCP is a plus.

If you are passionate about data infrastructure and analytics and are excited about the opportunity to work with a talented team to deliver high-quality data products, we want to hear from you! Apply now to join our team as an Analytics Engineer.

About Recro

Founded: 2014
Size: 100-1000
Stage: Profitable

Recro is a developer-focused platform that was founded with the aim of seamlessly matching individual expertise with the right opportunities.

We empower talented developers by providing them with relevant experience at fast-growing startups based on technical competencies and aspirations. These opportunities have a significant impact on their career success and help them become their best self.

 

On the other hand, startups get instant access to top-quality developers with guaranteed productivity from the very beginning. We help them to scale up/down based on their needs, thus ensuring an efficient and high-yielding workforce.

Developers solve complex, real-world problems and get exposure to the uplifting and challenging work culture at start-ups like Flipkart, Dunzo, Swiggy, and Zivame, among many others. At Recro, we ensure continuous support from our strong community to accelerate developers' careers and strive to create optimal business outcomes for high-growth startups.


Similar jobs

Posted by Lakshmi J
Chennai
2 - 3 yrs
₹10L - ₹18L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization

About the company:

VakilSearch is a technology-driven platform, offering services that cover the legal needs of startups and established businesses. Some of our services include incorporation, government registrations & filings, accounting, documentation and annual compliances. In addition, we offer a wide range of services to individuals, such as property agreements and tax filings. Our mission is to provide one-click access to individuals and businesses for all their legal and professional needs.

 

You can learn more about us at vakilsearch.com .

About the role:

A successful data analyst needs a combination of technical as well as leadership skills. A background in Mathematics, Statistics, Computer Science, or Information Management can serve as a solid foundation to build your career as a data analyst at VakilSearch.

 

Why join Vakilsearch:

 

  • Unlimited opportunities to grow
  • Flat hierarchy
  • Encouraging environment to unleash your out-of-the-box thinking skills

 

Responsibilities:

 

  • Preparing reports for the stakeholders and the management, enabling them to take important decisions based on various facts and trends.
  • Using automated tools to extract data from primary and secondary sources.
  • Identifying and recommending the right product metrics to be analysed and tracked for every feature/problem statement.
  • Using statistical tools to identify, analyze, and interpret patterns and trends in complex data sets that could be helpful for diagnosis and prediction.
  • Working with programmers, engineers, and management heads to identify process improvement opportunities, propose system modifications, and devise data governance strategies.

 

Required skills:

 

  • Bachelor’s degree from an accredited university or college in computer science, or a graduate of a data-science-related program
  • Minimum of 0 - 2 years of experience in analysing
at RaRa Now
Posted by Shalini Yadav
Remote only
3 - 5 yrs
₹10L - ₹15L / yr
Go Programming (Golang)
Goroutine
Godep
Echo
API

About Us :

  • RaRa Now is revolutionizing instant delivery for e-commerce in Indonesia through data-driven logistics.

  • RaRa Now is making instant and same-day deliveries scalable and cost-effective by leveraging a differentiated operating model and real-time optimization technology. RaRa makes it possible for anyone, anywhere to get same-day delivery in Indonesia. While others are focusing on one-to-one deliveries, the company has developed proprietary, real-time batching tech to do many-to-many deliveries within a few hours. RaRa is already in partnership with some of the top eCommerce players in Indonesia like Blibli, Sayurbox, Kopi Kenangan, and many more.

  • We are a distributed team with a company headquartered in Singapore, core operations in Indonesia, and a technology team based out of India.

Future of eCommerce Logistics :

  • Data-driven logistics company that is bringing in a same-day delivery revolution in Indonesia

  • Revolutionizing delivery as an experience

  • Empowering D2C Sellers with logistics as the core technology

About the Role :

  • Writing scalable, robust, testable, efficient, and easily maintainable code
  • Translating software requirements into stable, working, high performance software
  • Playing a key role in architectural and design decisions, building toward an efficient microservices distributed architecture.
  • Strong knowledge of Go programming language, paradigms, constructs, and idioms
  • Knowledge of language patterns such as Goroutines and Channels
  • Experience with the full suite of Go frameworks and tools, including:
  • Dependency management tools such as Godep.
  • Popular Go web frameworks, such as Echo
  • Request routing and API mechanisms
  • Ability to write clean and effective Godoc comments
  • Familiarity with code versioning tools - primarily Git.
  • A basic understanding of computing and Linux systems
  • Basic knowledge of Systems Engineering
  • Memory management and pointers, specifically in Golang
  • Implement Docker for smaller-scale applications that require simpler deployments
  • Employ Linux Terminal command structures to allow easy back-end operations for less-expert technical staff
  • Structure our user interface with React and ensure REST API access is available for enterprise-grade finance customers on-demand
Posted by Sharon Joseph
Bengaluru (Bangalore), Gurugram, Chennai, Pune
7 - 10 yrs
Best in industry
Data Science
Machine Learning (ML)
Natural Language Processing (NLP)
Computer Vision
Python

Job Summary

As a Data Science Lead, you will manage multiple consulting projects of varying complexity and ensure on-time and on-budget delivery for clients. You will lead a team of data scientists and collaborate across cross-functional groups, while contributing to new business development, supporting strategic business decisions, and maintaining and strengthening the client base.

  1. Work with the team to define business requirements, come up with analytical solutions, and deliver the solution with a specific focus on the big picture to drive robustness of the solution
  2. Work with teams of smart collaborators. Be responsible for their appraisals and career development.
  3. Participate and lead executive presentations with client leadership stakeholders.
  4. Be part of an inclusive and open environment. A culture where making mistakes and learning from them is part of life
  5. See how your work contributes to building an organization and be able to drive Org level initiatives that will challenge and grow your capabilities.

Role & Responsibilities

  1. Serve as expert in Data Science, build framework to develop Production level DS/AI models.
  2. Apply AI research and ML models to accelerate business innovation and solve impactful business problems for our clients.
  3. Lead multiple teams across clients ensuring quality and timely outcomes on all projects.
  4. Lead and manage the onsite-offshore relation, at the same time adding value to the client.
  5. Partner with business and technical stakeholders to translate challenging business problems into state-of-the-art data science solutions.
  6. Build a winning team focused on client success. Help team members build lasting career in data science and create a constant learning/development environment.
  7. Present results, insights, and recommendations to senior management with an emphasis on the business impact.
  8. Build engaging rapport with client leadership through relevant conversations and genuine business recommendations that impact the growth and profitability of the organization.
  9. Lead or contribute to org level initiatives to build the Tredence of tomorrow.

 

Qualification & Experience

  1. Bachelor's /Master's /PhD degree in a quantitative field (CS, Machine learning, Mathematics, Statistics, Data Science) or equivalent experience.
  2. 6-10+ years of experience in data science, building hands-on ML models
  3. Expertise in ML – Regression, Classification, Clustering, Time Series Modeling, Graph Network, Recommender System, Bayesian modeling, Deep learning, Computer Vision, NLP/NLU, Reinforcement learning, Federated Learning, Meta Learning.
  4. Proficient in some or all of the following techniques: Linear & Logistic Regression, Decision Trees, Random Forests, K-Nearest Neighbors, Support Vector Machines, ANOVA, Principal Component Analysis, Gradient Boosted Trees, ANN, CNN, RNN, Transformers.
  5. Knowledge of programming languages SQL, Python/ R, Spark.
  6. Expertise in ML frameworks and libraries (TensorFlow, Keras, PyTorch).
  7. Experience with cloud computing services (AWS, GCP or Azure)
  8. Expert in Statistical Modelling & Algorithms, e.g. hypothesis testing, sample size estimation, A/B testing.
  9. Knowledge of Mathematical Programming – Linear Programming, Mixed Integer Programming, etc. – and Stochastic Modelling – Markov chains, Monte Carlo, Stochastic Simulation, Queuing Models.
  10. Experience with Optimization Solvers (Gurobi, CPLEX) and algebraic modeling languages (PuLP).
  11. Knowledge of GPU code optimization and Spark MLlib optimization.
  12. Familiarity with deploying and monitoring ML models in production, delivering data products to end-users.
  13. Experience with ML CI/CD pipelines.
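As a concrete instance of the hypothesis-testing and A/B-testing items above, a pooled two-proportion z-test fits in a few lines of standard-library Python (the conversion counts are invented for illustration):

```python
from math import sqrt, erf


def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Z-test for the difference of two conversion rates, using a pooled standard error."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value


# Variant A: 200/2000 conversions (10%); variant B: 260/2000 (13%).
z, p = two_proportion_ztest(200, 2000, 260, 2000)
print(round(z, 2), p < 0.05)  # z ≈ 2.97; the lift is significant at the 5% level
```

Sample-size estimation is the same arithmetic run in reverse: fix the detectable lift and significance level, then solve for n.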
Reputed MNC client of people first consultant
Agency job
Bengaluru (Bangalore), Mysore, Chennai, Pune
9 - 12 yrs
₹12L - ₹30L / yr
Amazon Web Services (AWS)
AWS Lambda
POC
Designation: AWS Architect
Experience: 9-12 years
Location: Bangalore
Job Description

Strong Experience across Applications Migration to Cloud, Cloud native Architecture, Amazon EKS, Serverless (Lambda).​

Delivery of customer Cloud Strategies aligned with customers business objectives and with a focus on Cloud Migrations and App Modernization.​

Design of clients Cloud solutions with a focus on AWS.​

Undertake short-term delivery engagements related to cloud architecture with a specific focus on AWS and Cloud Migrations/Modernization.​

Provide leadership in migration and modernization methodologies and techniques including mass application movements into the cloud.​

Implementation of AWS within large, regulated enterprise environments.

Nurture Cloud computing expertise internally and externally to drive Cloud Adoption.​

Work with designers and developers in the team to guide them through the solution implementation.​

Participate in performing Proof of Concept (POC) work for various upcoming technologies to fit business requirements.
IT Company
Agency job
via Kalibre global konnects by Vimal Patel
Ahmedabad
1 - 6 yrs
₹3L - ₹7L / yr
ASP.NET
ASP.NET MVC
Web API
AJAX
MySQL
• Job Title:- Asp.Net MVC Developer
• Job Location:- Opp. Sola over bridge, Ahmedabad
• Education:- B.E./ B. Tech./ M.E./ M. Tech/ MCA
• Desired Skills:- .Net 4.5 and later, Asp.Net MVC, C#, JavaScript, jQuery, CSS3, Web API, AngularJS, .Net Core, SQL Server, MySQL, Entity Framework
• Experience:- 01yrs to 04yrs
• Number of Vacancy:- 02
• 5 Days working
• Job Timing:- 10am to 7:30pm

Roles & Responsibility:-

• Producing clean, efficient code based on specifications.
• Fixing and improving existing software.
• Integrate software components and third-party programs.
• Verify and deploy programs and systems.
• Troubleshoot, debug and upgrade existing software.
• Gather and evaluate user feedback.
• Recommend and execute improvements.
• Create technical documentation for reference and reporting

Job Requirement:-

• Must have good experience with Asp.Net, MVC, C#, JavaScript, jQuery, etc.
• Experience with software design and development in a test-driven environment.
• Knowledge of coding languages (e.g. C#, VB, JavaScript) and frameworks/systems (e.g. AngularJS, Git).
• Experience with databases and Object-Relational Mapping (ORM).
• Ability to learn new languages and technologies.
• Excellent communication skills.

Regards, 
Vimal Patel
at CoStrategix Technologies
Posted by Jayasimha Kulkarni
Remote, Bengaluru (Bangalore)
4 - 8 yrs
₹10L - ₹28L / yr
Data engineering
Data Structures
Programming
Python
C#

 

Job Description - Sr Azure Data Engineer

 

 

Roles & Responsibilities:

  1. Hands-on programming in C# / .Net.
  2. Develop serverless applications using Azure Function Apps.
  3. Writing complex SQL Queries, Stored procedures, and Views. 
  4. Creating Data processing pipeline(s).
  5. Develop / Manage large-scale Data Warehousing and Data processing solutions.
  6. Provide clean, usable data and recommend data efficiency, quality, and data integrity.

 

Skills

  1. Should have working experience on C# /.Net.
  2. Proficient with writing SQL queries, Stored Procedures, and Views
  3. Should have worked on Azure Cloud Stack.
  4. Should have working experience of developing serverless code.
  5. Must have worked on Azure Data Factory (mandatory).

 

Experience 

  1. 4+ years of relevant experience

 

at StatusNeo
Posted by Alex P
Gurugram, Bengaluru (Bangalore), Pune
2 - 15 yrs
₹10L - ₹35L / yr
Scala
PySpark
Data engineering
Big Data
Hadoop

Data Engineer – SQL, RDBMS, pySpark/Scala, Python, Hive, Hadoop, Unix

 

Data engineering services required:

  • Build data products and processes alongside the core engineering and technology team;
  • Collaborate with senior data scientists to curate, wrangle, and prepare data for use in their advanced analytical models;
  • Integrate data from a variety of sources, ensuring that it adheres to data quality and accessibility standards;
  • Modify and improve data engineering processes to handle ever larger, more complex, and more varied data sources and pipelines;
  • Use Hadoop architecture and HDFS commands to design and optimize data queries at scale;
  • Evaluate and experiment with novel data engineering tools and advise information technology leads and partners about new capabilities to determine optimal solutions for particular technical problems or designated use cases.

 

Big data engineering skills:

  • Demonstrated ability to perform the engineering necessary to acquire, ingest, cleanse, integrate, and structure massive volumes of data from multiple sources and systems into enterprise analytics platforms;
  • Proven ability to design and optimize queries to build scalable, modular, efficient data pipelines;
  • Ability to work across structured, semi-structured, and unstructured data, extracting information and identifying linkages across disparate data sets;
  • Proven experience delivering production-ready data engineering solutions, including requirements definition, architecture selection, prototype development, debugging, unit-testing, deployment, support, and maintenance;
  • Ability to operate with a variety of data engineering tools and technologies
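The acquire-cleanse-structure pattern described above can be illustrated with composable generator stages in plain Python (the field names are invented; at the data volumes this role describes, the same staged pattern would live in Spark or Hive jobs):

```python
import csv
import io


def parse_rows(lines):
    """Stream rows from a CSV source without loading everything into memory."""
    yield from csv.DictReader(lines)


def cleanse(rows):
    """Drop rows that fail basic quality checks and normalize types."""
    for row in rows:
        if not row.get("user_id") or not row.get("amount"):
            continue  # quality gate: skip incomplete records
        try:
            row["amount"] = float(row["amount"])
        except ValueError:
            continue  # skip malformed numerics
        yield row


# A stand-in for a file or network stream; two of the four rows are dirty.
raw = io.StringIO("user_id,amount\n1,9.5\n,3.0\n2,oops\n3,4.25\n")
clean = list(cleanse(parse_rows(raw)))
print(len(clean))  # → 2
```

Because each stage is a generator, stages compose like a pipeline and memory use stays constant regardless of input size.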
E commerce & Retail
Agency job
via Myna Solutions by Venkat B
Chennai
5 - 10 yrs
₹8L - ₹18L / yr
Machine Learning (ML)
Data Science
Python
Tableau
SQL
Job Title : Data Science Engineer
Work Location : Chennai
Experience Level : 5+ yrs
Package : Up to 18 LPA
Notice Period : Immediate Joiners
It's a full-time opportunity with our client.

Mandatory Skills: Machine Learning, Python, Tableau & SQL

Job Requirements:

--2+ years of industry experience in predictive modeling, data science, and analysis.

--Experience with ML models including but not limited to Regression, Random Forests, XGBoost.

--Experience in an ML engineer or data scientist role building and deploying ML models, or hands-on experience developing deep learning models.

--Experience writing code in Python and SQL with documentation for reproducibility.

--Strong Proficiency in Tableau.

--Experience handling big datasets, diving into data to discover hidden patterns, using data visualization tools, writing SQL.

--Experience writing and speaking about technical concepts to business, technical, and lay audiences and giving data-driven presentations.

--AWS SageMaker experience is a plus, not required.
Remote, Bengaluru (Bangalore), Hyderabad
0 - 1 yrs
₹2.5L - ₹4L / yr
SQL
Data engineering
Big Data
Python
● Hands-on Work experience as a Python Developer
● Hands-on Work experience in SQL/PLSQL
● Expertise in at least one popular Python framework (like Django, Flask or Pyramid)
● Knowledge of object-relational mapping (ORM)
● Familiarity with front-end technologies (like JavaScript and HTML5)
● Willingness to learn & upgrade to Big Data and cloud technologies like PySpark, Azure, etc.
● Team spirit
● Good problem-solving skills
● Write effective, scalable code
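A minimal sketch of the Python-plus-SQL pairing this role asks for, using the standard-library sqlite3 driver (the table and data are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, active INTEGER)")
conn.executemany(
    "INSERT INTO users (name, active) VALUES (?, ?)",
    [("asha", 1), ("ravi", 0), ("meena", 1)],
)

# Parameterized queries keep user input out of the SQL string itself.
rows = conn.execute(
    "SELECT name FROM users WHERE active = ? ORDER BY name", (1,)
).fetchall()
print([r[0] for r in rows])  # → ['asha', 'meena']
```

The `?` placeholders are the habit worth building early: they prevent SQL injection and let the database cache the query plan.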
at Blok
Posted by Shoa Khan
Bengaluru (Bangalore)
3 - 8 yrs
₹10L - ₹25L / yr
PostgreSQL
Java
Scala
C++
Javascript

Who We Are

Getir is a technology company, a pioneer of the online ultra-fast grocery delivery service business, that has transformed the way in which millions of people across the world consume groceries. We believe in a world where getting everything you need, when you need it, sustainably, is the new normal. 

Getir is growing incredibly fast in Europe, but we want to grow globally. From London to Tokyo, Sao Paulo to New York, our global ambitions can only be accomplished with exceptional technology.

If you've got the experience and the ambition to be a Database Administrator (Postgres) at Getir and be a founding part of our technology hub in Bangalore, please apply.


What you’ll be doing:

  • Work with engineering and other teams to build and maintain our database infrastructure and answer big data questions
  • Use your past DBA experience and industry wide best practices to scale and optimize the database services.
  • Regularly conduct database health monitoring and diagnostics
  • Create processes to ensure data integrity and identify potential data errors.
  • Document and update procedures and processes.
  • Troubleshoot and resolve problems as they arise

What we look for in you:

  • You have a Bachelor’s degree in Computer Science, Computer Engineering, Data Science, or another related field.
  • 3+ years of experience as a Database Administrator
  • Proficiency administering PostgreSQL
  • Extensive experience performing general troubleshooting and database maintenance activities, including backup and recovery, capacity planning, and managing user accounts.
  • Experience in identifying and documenting risk areas and mitigation strategies for process and procedure activities.
  • Experience in managing schemas, indexing, objects, and partitioning the tables.
  • Experience in managing system configurations.
  • Experience creating data design models, database architecture, and data repository design.
  • Strong understanding of SQL tuning and optimization of query plans.
  • Linux shell scripting skills and experience with production Linux environments
  • Experience working with software engineers in a highly technical environment.
  • Knowledge of 1+ programming language (e.g. C++, Scala, Java, JavaScript etc.)
  • Excellent verbal and written communication skills.
  • Knowledge administering MongoDB (Good to have)
  • Knowledge administering Amazon RDS for PostgreSQL & Redshift. (Good to have)