We are looking for a Data Engineer who will be responsible for collecting, storing, processing, and analyzing large data sets coming from different sources.
Responsibilities
- Work with Big Data tools and frameworks to provide requested capabilities
- Identify development needs in order to improve and streamline operations
- Develop and manage BI solutions
- Implement ETL processes and data warehousing
- Monitor performance and manage infrastructure
Skills
- Proficient understanding of distributed computing principles
- Proficiency with Hadoop and Spark
- Experience building stream-processing systems using solutions such as Kafka and Spark Streaming
- Good knowledge of data querying tools such as SQL and Hive
- Knowledge of various ETL techniques and frameworks
- Experience with Python, Java, or Scala (at least one)
- Experience with cloud services such as AWS or GCP
- Experience with NoSQL databases such as DynamoDB or MongoDB is an advantage
- Excellent written and verbal communication skills

About Yulu Bikes
Role Proficiency:
This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept in ETL tools like Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.
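To make the pipeline pattern concrete, here is a minimal, illustrative sketch of the ingest-wrangle-transform-join flow in plain Python. A real implementation would run on PySpark, Glue, or Databricks; every record field and function name here is invented for the example.

```python
# Illustrative sketch only: a toy ingest -> wrangle -> transform -> join
# pipeline in plain Python. Real pipelines would use PySpark/Glue/Databricks;
# all names and fields here are hypothetical.

def ingest(rows):
    """Wrangling step: drop malformed records missing a user_id."""
    return [r for r in rows if r.get("user_id") is not None]

def transform(rows):
    """Normalize the amount field from string to float."""
    return [{**r, "amount": float(r["amount"])} for r in rows]

def join(orders, users):
    """Join orders to users on user_id, enriching with country."""
    by_id = {u["user_id"]: u for u in users}
    return [
        {**o, "country": by_id[o["user_id"]]["country"]}
        for o in orders
        if o["user_id"] in by_id
    ]

orders = ingest([
    {"user_id": 1, "amount": "10.5"},
    {"user_id": None, "amount": "3.0"},   # dropped during wrangling
])
users = [{"user_id": 1, "country": "IN"}]
result = join(transform(orders), users)
print(result)  # [{'user_id': 1, 'amount': 10.5, 'country': 'IN'}]
```

The same three stages map one-to-one onto DataFrame operations (`filter`, `withColumn`, `join`) in a Spark job.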
Skill Examples:
- Proficiency in SQL, Python, or other programming languages used for data manipulation.
- Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
- Hands-on experience with cloud platforms like AWS Azure or Google Cloud particularly with data-related services (e.g. AWS Glue BigQuery).
- Conduct tests on data pipelines and evaluate results against data quality and performance specifications.
- Experience in performance tuning.
- Experience in data warehouse design and cost improvements.
- Apply and optimize data models for efficient storage retrieval and processing of large datasets.
- Communicate and explain design/development aspects to customers.
- Estimate time and resource requirements for developing/debugging features/components.
- Participate in RFP responses and solutioning.
- Mentor team members and guide them in relevant upskilling and certification.
Knowledge Examples:
- Knowledge of various ETL services used by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/Dataflow, Azure ADF, and ADLF.
- Proficient in SQL for analytics and windowing functions.
- Understanding of data schemas and models.
- Familiarity with domain-related data.
- Knowledge of data warehouse optimization techniques.
- Understanding of data security concepts.
- Awareness of patterns, frameworks, and automation practices.
Additional Comments:
# of Resources: 22
Role(s): Technical Role
Location(s): India
Planned Start Date: 1/1/2026
Planned End Date: 6/30/2026
Project Overview:
Role Scope / Deliverables: We are seeking a highly skilled Data Engineer with strong experience in Databricks, PySpark, Python, SQL, and AWS to join our data engineering team on or before the first week of December 2025.
The candidate will be responsible for designing, developing, and optimizing large-scale data pipelines and analytics solutions that drive business insights and operational efficiency.
Design, build, and maintain scalable data pipelines using Databricks and PySpark.
Develop and optimize complex SQL queries for data extraction, transformation, and analysis.
Implement data integration solutions across multiple AWS services (S3, Glue, Lambda, Redshift, EMR, etc.).
Collaborate with analytics, data science, and business teams to deliver clean, reliable, and timely datasets.
Ensure data quality, performance, and reliability across data workflows.
Participate in code reviews, data architecture discussions, and performance optimization initiatives.
Support migration and modernization efforts for legacy data systems to modern cloud-based solutions.
Key Skills:
Hands-on experience with Databricks, PySpark & Python for building ETL/ELT pipelines.
Proficiency in SQL (performance tuning, complex joins, CTEs, window functions).
Strong understanding of AWS services (S3, Glue, Lambda, Redshift, CloudWatch, etc.).
Experience with data modeling, schema design, and performance optimization.
Familiarity with CI/CD pipelines, version control (Git), and workflow orchestration (Airflow preferred).
Excellent problem-solving, communication, and collaboration skills.
Skills: Databricks, PySpark & Python, SQL, AWS Services
Must-Haves
Python/PySpark (5+ years), SQL (5+ years), Databricks (3+ years), AWS Services (3+ years), ETL tools (Informatica, Glue, DataProc) (3+ years)
Hands-on experience with Databricks, PySpark & Python for ETL/ELT pipelines.
Proficiency in SQL (performance tuning, complex joins, CTEs, window functions).
Strong understanding of AWS services (S3, Glue, Lambda, Redshift, CloudWatch, etc.).
Experience with data modeling, schema design, and performance optimization.
Familiarity with CI/CD pipelines, Git, and workflow orchestration (Airflow preferred).
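As a concrete illustration of the window-function skill listed above, here is a small, hedged example using Python's built-in sqlite3 module. The sales table and its columns are invented for the example, and SQLite 3.25+ (bundled with modern Python) is assumed for window-function support.

```python
# Hedged sketch of SQL window functions using the standard-library sqlite3
# module (SQLite 3.25+ required). Table and column names are made up.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("north", 100), ("north", 250), ("south", 300), ("south", 50)],
)

# Rank each sale within its region by amount, highest first.
rows = conn.execute(
    """
    SELECT region, amount,
           ROW_NUMBER() OVER (PARTITION BY region ORDER BY amount DESC) AS rn
    FROM sales
    ORDER BY region, rn
    """
).fetchall()
print(rows)
# [('north', 250, 1), ('north', 100, 2), ('south', 300, 1), ('south', 50, 2)]
```

The `PARTITION BY` clause restarts the numbering per region, which is the key difference from a plain `ORDER BY` over the whole table.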
Notice period: Immediate to 15 days
Location: Bangalore
About us:
We are not just an ad agency or a creative agency; we are a Communication Company. Founded in 2014, Moshi Moshi is a young, creative, gutsy, and committed communication company that wants its clients to always Expect the EXTRA from it.
Our primary clientele consists of Startups & Corporates like Ola, Zoomcar, Mercedes Benz, ITC, Aditya Birla Group, Colive, MTV, Toit, IHCL, Jaquar, Sobha, Simple Energy, and Godrej amongst others. We have a huge team of creative folks, marketers, learners, developers, coders and a puppy momo, who believe Moshi Moshi is an experience rather than a company.
Why Moshi Moshi? The learning curve at Moshi Moshi is very high compared to the industry average, primarily because you get to work with companies, brand managers, and marketers of different sizes and thought processes who push you to think better and faster. So, hop on to the ride we dearly call Moshi Moshi and let's say hello to the world.
PS:- We are also very close to a lot of food joints and breweries, so in case your manager/boss gives you a lot of work or is Expecting the Extra, you can quickly grab a recharge and continue with your everyday life struggles. We can't do much about the manager!
Join our team as a Social Media Cinematographer Intern!
Gain hands-on experience in cinematography while contributing to our vibrant social media presence. Collaborate with our creative team to produce engaging video content for various social media platforms. Assist in camera operation, lighting setups, and post-production tasks. Ideal candidates are passionate about filmmaking, possess technical skills in camera operation and editing software, and thrive in a fast-paced environment. Enrolled in or recent graduate of a film production program preferred.
This is a 3 month internship with possible stipend or academic credit.
Apply with resume and portfolio.
Join us and make your mark in the exciting world of social media storytelling!
Need IMMEDIATE JOINERS ONLY.
Candidate must have experience with React, Angular, MobX, and SCSS.
Candidate should have hands-on experience working with RDBMS.
Candidate should have experience building applications from scratch.
Candidate should be particularly strong in Angular.
Candidate from service based companies will be preferred.
Job description
Daily Doc Technologies LLC (https://dailydoc.io) was conceived to innovate and bring cutting-edge technology to medicine. Our mission is to make patient care more efficient and effortless and to minimise medical errors. We focus on bringing useful IT solutions to medicine.
With advancements in technology, communication in healthcare can be made seamless and effortless. Lack of effective communication is one of the main causes of medical errors and unwanted outcomes. The Daily Doc Healthcare App brings technology into today's complex medical environment to give healthcare providers the tools needed for effortless, reliable and secure communication. Designed by doctors and nurses, we strive to make our platform better every day. Honesty and integrity are our core values. We strive to innovate in healthcare to bring about positive, meaningful changes in people's lives.
Preferred Experience:
- 2+ years of experience working with mobile development.
- At least 1 to 2 years of experience in Flutter development.
- Knowledge of chat applications and technologies like Socket.io and WebSockets is highly preferred.
- Deployed at least 3 complete apps with REST APIs linked.
- Cross-platform mobile app developers who have built mobile apps with Flutter.
- Experience with Flutter for both iOS and Android; knowledge of native technologies is a bonus.
- Familiarity with linking RESTful APIs.
- Knowledge of modern authorisation mechanisms, such as JSON Web Token.
- Ability to understand business requirements and translate them into technical requirements.
- Firebase Auth, Dart, Bloc, Cubit, MVC, Socket.io, WebSockets, Providers, Network Call, Web Support, Offline Apps, Local Storage (or Sqflite), Google Maps API, and Google Material Design are the preferred tech stack.
- Know how to deal with different screen sizes.
- Experience with version control such as Git and GitHub.
- Native Android skills like Kotlin, XML, the Android lifecycle, crash reporting tools, and usage tracking tools are a bonus.
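Since JSON Web Tokens appear in the requirements above, here is a minimal sketch of the token structure: three dot-separated base64url segments (header.payload.signature). This is illustrative only; it builds an unsigned example token, and a production app should use a vetted JWT library and verify signatures.

```python
# Hedged sketch of JWT structure: header.payload.signature, each segment
# base64url-encoded JSON. The token here is hand-made and unsigned purely
# for illustration; never skip signature verification in real code.
import base64
import json

def b64url_encode(obj):
    raw = json.dumps(obj, separators=(",", ":")).encode()
    return base64.urlsafe_b64encode(raw).rstrip(b"=").decode()

def decode_payload(token):
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload))

header = b64url_encode({"alg": "none", "typ": "JWT"})
payload = b64url_encode({"sub": "user42", "role": "nurse"})
token = f"{header}.{payload}."  # empty signature segment for illustration

claims = decode_payload(token)
print(claims)  # {'sub': 'user42', 'role': 'nurse'}
```

The payload is only encoded, not encrypted, which is why sensitive data should never be placed in JWT claims.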
Responsibilities
- Design and Build sophisticated and highly scalable apps using Flutter.
- Translate and build the designs into high quality responsive UI code.
- Write efficient queries for core Data.
- Resolve any problems existing in the system and suggest and add new features in the complete system.
- Follow the best practices while developing the app.
- Document the project and code efficiently.
- Manage the code and project on Git in order to keep in sync with other team members and managers.
- Knowledge of different state management libraries like BloC and GetX.
About the company
Founded in: 2018
Website: https://dailydoc.io
Total Employees: 5
Job Types: Full-time, Permanent
Salary: ₹300,000.00 - ₹1,000,000.00 per year
Speak with the employer
96-99-56-97-85
We are looking for a savvy Data Engineer to join our growing team of analytics experts.
The hire will be responsible for:
- Expanding and optimizing our data and data pipeline architecture
- Optimizing data flow and collection for cross functional teams.
- Will support our software developers, database architects, data analysts and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects.
- Must be self-directed and comfortable supporting the data needs of multiple teams, systems and products.
- Experience with Azure: ADLS, Databricks, Stream Analytics, SQL DW, Cosmos DB, Analysis Services, Azure Functions, Serverless Architecture, ARM Templates
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with object-oriented/functional scripting languages: Python, SQL, Scala, Spark-SQL, etc.
Nice to have experience with :
- Big data tools: Hadoop, Spark and Kafka
- Data pipeline and workflow management tools: Azkaban, Luigi, Airflow
- Stream-processing systems: Storm
Database: SQL DB
Programming languages: PL/SQL, Spark SQL
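Airflow, Luigi, and Azkaban, mentioned in the nice-to-haves above, all boil down to running tasks in DAG dependency order. This toy sketch (not any tool's real API; task names are invented) shows the core idea with Python's standard `graphlib`:

```python
# Toy illustration of what workflow managers like Airflow/Luigi/Azkaban do
# at their core: execute tasks in topological (dependency) order.
# Task names are invented for the example.
from graphlib import TopologicalSorter

# task -> set of upstream tasks it depends on
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'transform', 'load', 'report']
```

Real schedulers add retries, backfills, and parallel execution of independent branches on top of this ordering.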
Looking for candidates with Data Warehousing experience, strong domain knowledge & experience working as a Technical lead.
The right candidate will be excited by the prospect of optimizing or even re-designing our company's data architecture to support our next generation of products and data initiatives.
● Ability to design, implement, and maintain highly complex systems and subsystems.
● Writing well-designed, testable and efficient code.
● Designing and developing advanced applications for the Android platform.
● Working as a part of a dynamic team to deliver winning products.
● Troubleshooting, debugging and optimizing existing applications.
Requirements:
● Bachelor's degree in Computer Science, a related technical field, or equivalent practical experience
● Strong logical and analytical skills
● Should be adaptable and a fast learner
● Experience in computer science, data structures, algorithms and software design
● Experience in software development and coding in any general-purpose programming language
● Should have an interest in Android development
Location: Noida
Tech9 India is looking for Software Developers to join our team. We have mid-level to principal roles available. Depending on the role, the position requires the use of front-end and back-end technologies. This is a great opportunity to work with a company whose primary focus is making customers happy by delivering value, without all the burdensome policies and rules that have become typical of outsourced software development companies.
If you are looking for a change this is what we can promise you:
- You will have challenging problems to solve
- You will have flexibility and autonomy to solve problems and deliver solutions
- We will provide a highly collaborative environment with skilled and super friendly teammates
- We will fully support you in developing software the right way
- We won't burden you with useless policies and procedures
- We will provide you the tools you need to do your job right
If this sounds attractive please apply! We'd love to talk to you!
Responsibilities:
- Work with cross-functional teams, using agile practices, write, debug and deliver code
- Produce solid, thoroughly tested features
Requirements:
- 3+ years of experience with JavaScript, its frameworks, and unit testing frameworks
- At least 3 years of experience developing in Node and React; experience with Go is nice to have
- Strong verbal and written communication skills
- Experience working with product managers and scrum teams to effectively translate business requirements into workflows
- Demonstrated ability to resolve problems and overcome roadblocks
- Cloud platform experience
Good to have for mid/senior roles, and must have for the Principal role:
- Hands-on experience with APIs, microservices
- Experience with CI/CD systems such as Jenkins
- Experience with JSON and developing REST Services
- Expert experience leveraging container-based technologies
- Multiple years of experience delivering solutions through an Agile delivery methodology
- Experience with Bash shell scripts, Linux utilities, and UNIX commands
- Ability to understand complex systems and solve challenging analytical problems
- Ability to leverage multiple tools and programming languages to analyze and manipulate data sets from disparate data sources
- Strong collaboration and communication skills within and across teams
- Strong problem solving skills and critical thinking ability
We are looking for a senior graphic designer with a keen eye for detail and aesthetics to join our team. The role includes designing social media posts, banners, email layouts, page layouts, etc.
The role can be fulfilled both remotely and in person. Our office is based in Santacruz West, Mumbai.
- Strong conceptualization ability, strong visual communication ability, and creative skills.
- Understanding of contemporary user experience planning, interaction design, and visual design.
- Candidate must absolutely be in sync with recent design standards and trends.
- Work experience in designing UI for mobile, web and desktop-based applications
- Experience with HTML, CSS, JavaScript, jQuery a plus
- Expertise in Creating Pixel Perfect HTML/ CSS/Theme a plus
- Understanding and experience with responsive and multi-device design.
- Strong working knowledge of Photoshop and associated design tools.
- Working with Content management systems
- Editing content, debugging code and re-designing webpages
• Excellent understanding of machine learning techniques and algorithms, such as SVM, decision forests, k-NN, Naive Bayes, etc.
• Experience in selecting features and building and optimizing classifiers using machine learning techniques.
• Prior experience with data visualization tools, such as D3.js, ggplot, etc.
• Good knowledge of statistics, such as distributions, statistical testing, regression, etc.
• Adequate presentation and communication skills to explain results and methodologies to non-technical stakeholders.
• A basic understanding of the banking industry is a value-add.
• Develop, process, cleanse and enhance data collection procedures from multiple data sources.
• Conduct and deliver experiments and proofs of concept to validate business ideas and potential value.
• Test, troubleshoot and enhance the developed models in distributed environments to improve their accuracy.
• Work closely with product teams to implement algorithms with Python and/or R.
• Design and implement scalable predictive models and classifiers leveraging machine learning and data regression.
• Facilitate integration with enterprise applications using APIs to enrich implementations.
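As a minimal illustration of the k-NN technique named in the skills above, here is a toy classifier in plain Python. The data points are invented, and in practice one would reach for scikit-learn; this just shows the nearest-neighbour majority-vote idea.

```python
# Minimal k-nearest-neighbours classifier in plain Python, illustrating the
# k-NN technique. Data points are invented; use scikit-learn in practice.
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """train: list of (features, label) pairs. Returns the majority label
    among the k training points nearest to query (Euclidean distance)."""
    nearest = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    labels = [label for _, label in nearest]
    return Counter(labels).most_common(1)[0][0]

train = [
    ((1.0, 1.0), "low"), ((1.2, 0.8), "low"), ((0.9, 1.1), "low"),
    ((5.0, 5.0), "high"), ((5.2, 4.9), "high"), ((4.8, 5.1), "high"),
]
print(knn_predict(train, (1.1, 1.0)))  # low
print(knn_predict(train, (5.1, 5.0)))  # high
```

Choosing an odd `k` avoids ties in two-class problems; scaling features first matters because Euclidean distance is unit-sensitive.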